This is the last post of the series. What is next?

I am sharing this post with Antonio.

To continue the explanation of our robot H9832B, we must admit that the process of cleaning out the inside of the duck (removing all the cables, the motor, the speaker and the batteries) was a lot of fun. The process of destroying and stripping down our doll gave a different value to the project; in other words, the idea of building a robot by deconstructing a small doll was fun, and it gave us plenty of action along with the creation of the project.

To clarify what our H9832B project is: we relied on a scheme drawn by our professor, William Turkel, on 10 February 2012.

figure 1

After Antonio and I talked about creating a robot that could be controlled by an Arduino platform, we went to talk to Professor William Turkel, who ended up drawing a scheme of what our design could be.

The idea was to control the movements of the robot, in this case our doll, the H9832B duck (Figure 1, right side of the image), using a laptop that receives commands over the internet, that is, through the application Twitter (Figure 1, left side of the image).

We wanted to control the robot with several motors at the same time, in our case the "9 GR Servo" motors, and to do that we started using a "Micro Maestro #1350" with a battery. The six servos were controlled through their connection to the Micro Maestro, an external power supply, and a programming application called the Pololu Maestro Control Center.

figure 2 Antonio

After much time spent understanding how to manage and program the servos, we realized that we did not really need the Micro Maestro, because we could manage our H9832B duck with only two servos. What had led us to use the Micro Maestro was the belief that we needed more power to move several motors than the Arduino could supply. The truth is that it was a good experience, in the sense that we learned how to move six servos at the same time with different programming systems. Now that we have more knowledge, we can apply it to other projects we want to do in the future.

Here you can download the "Pololu Maestro Control Center".

All the decisions we have taken so far come out of the ongoing work that my colleague Antonio and I have been investing throughout this course. That is, while we were working with the various programming systems, we were also working on the construction of the robot.

As we said before, we decided to use only two servos, which are able to move all parts of the robot. The first servo moves both wings, and the second servo is responsible for the movement of the head and beak at the same time. We must also say that we replaced the original duck's eyes with two red LED lights, to give it a machine look.

The process of placing the motors and lights was also fun. We attached the motors with thread so that they fit well inside the duck's body and could rotate freely. Before the final placement of the motors we ran many tests to make sure they could move the parts we wanted. Each motor has a different movement; that is, we programmed the angles for safe movements that would not break anything. This work was not easy, particularly the movement of the head, for several reasons. We had to put two LEDs inside the head's shell, one for each eye, which meant finding the right space for them so that everything could move freely without the moving parts catching the cables.

figure 3

The problem was that we did not have much space inside the head's shell: it was too small, and it also contained the structure that allows the duck to open and close its beak. After many attempts, we were able to place the two LEDs inside the head along with the wires that power them. These wires come out of the head in a way that still allows its rotary motion.

figure 4

After securing the lights' operation and the mechanism of rotational movement, we jumped to the next step: tying the head to a servo with a thread. We could not tie the thread directly to the servo, because if we did, the servo would not be able to move the head. We had to tie the thread to the head, pass it through an existing fixed part, and then attach it to the motor; in other words, we created a pulley system. At this point, after seeing our system working, we started laughing because we were so happy we had succeeded.

figure 5

Next we fixed one wing to the other servo with a thread. This part was easier because we had more space inside the duck, and therefore the servo movement was not difficult. Tying the other wing was the last thing we wanted to do, since it meant the complete closure of the entire body. After quite a bit of hard work and many programming errors, we finally managed to synchronize the different movements of the head, beak and wings. All of this was done by connecting the various wires coming out of the duck to the Electronic Brick Chassis and the Arduino UNO, while the Arduino UNO was connected to our laptop. The language we use to speak from the laptop to the Arduino UNO is called Wiring.

Here you can download the software: http://arduino.cc/hu/Main/Software

Finally, everything worked perfectly. Happy with that, we went on to tie the other wing and close up the entire body, and then began to check everything. Before finishing, we have to tell you about another important element of our project.

Everything we are doing with the programming is to control the movements of our robot through a tweet sent to the account we have run since the beginning of the project. That account is @H9832B, and the commands are:

– @H9832B $wings$: upon receiving this command on our Twitter account, the duck flaps its wings three times, as we have programmed.

– @H9832B $head$: upon receiving this command, the duck moves its head and opens and closes its beak twice, as we have programmed.

– @H9832B $everything$: upon receiving this command, the duck first moves its head and opens and closes its beak twice, and then flaps its wings three times, again as programmed.

Regarding the lights, we decided they should always be on, but in blinking mode.

But how can we know who has sent a tweet to @H9832B?

Since we do not want to keep our Twitter account open all the time to see who sent the commands mentioned above, we decided to use an LCD screen, which we programmed. Thus, whenever there is a command for our H9832B duck, the LCD shows who sent the command and what kind of command it was.

video 1

The LCD screen was always going to be a part of our project, even before we switched to this idea, as mentioned in the previous post.

What is interesting here is that we managed to program the LCD screen to display long messages by making them scroll across the screen.

Now, back to what we were saying when we started to check everything, that is, the movements of our H9832B duck along with the LCD screen. So... here comes the moment of truth... and the result was disappointing. Our spirits sank and there was an uncomfortable moment of silence; basically, it did not work. Soon I began to think that for every problem there is a solution, and the truth is that having an IT man such as Antonio as my colleague made the problem less severe. He always has a solution for every problem, so it was good to have him as my co-worker.

The problem that arose was that the screen did not work well: it was blinking too fast and showing strange symbols, and we did not understand what was happening. We thought that with everything plugged in at the same time there was not enough power for it all. We quickly contacted Professor William Turkel to ask for another Arduino UNO, thinking we might use one Arduino for the H9832B duck and another for the LCD screen. It was a very stressful time of reflection.

What could we do now? It was Friday, only a few days were left before the final delivery, and we could not afford to lose time.

But then, of course, Antonio saved our project again. The problem was that our LCD screen was plugged into BUS2 of the Electronic Brick Chassis. BUS2 shares pins with D9, D10, D11 and D12, which is where the robot's wires were plugged in. So we moved the LCD screen from BUS2 to BUS1, which does not involve the pins used by the robot's cables. It was a very intense moment, seconds seemed like minutes, and at that very moment another surprise was waiting for us, but in the positive sense.

Finally, our H9832B duck was moving according to the commands and the LCD screen was showing the messages perfectly. So yes, there was joy again!

video 2

In this section, we will present the more technical aspects of our project, as follows.

The H9832B duck is a robot that obeys commands sent from Twitter. In order to make it work, we had to face some technological and mechanical challenges. The first part of this post will deal with the technological issues.
Let's start by introducing all the components of the project:

– Duck:


figure 6

– Liquid Crystal Display:

figure 7

– Arduino UNO:

figure 8

– Arduino Sensor Shield V4 (Electronic Brick Chassis):

figure 9

– Laptop:

figure 10

This project had two main modules. The first module dealt with sending the tweet from Twitter to the computer. The second one was responsible for the execution of the command on the Arduino.

figure 11

TWITTER AND PYTHON

The first step of our project is to send a tweet to the duck. For this reason, we created an account for it on Twitter: @H9832B. Like every user, @H9832B follows other accounts, and in our particular case "following" means that our duck accepts commands.

Every user can send tweets to the duck, but it only obeys those whom it follows. Following is the pathway by which other users are allowed to command the duck. Those users do not necessarily have to follow the duck back on Twitter: as long as the duck follows a user, that user can command it.

figure 12

The next step is to send the tweet to the computer. Actually, Twitter does not send anything to the computer; it is the computer itself that searches for new tweets. We achieved this by programming a script in Python. Here is what the script does (a sketch of the full script appears after the pySerial methods below):

1. Accesses @H9832B's Twitter account
2. Gets its friends (the accounts it follows), including itself (@H9832B)
3. For each friend, gets their last tweet
4. Filters the tweets that are commands (they have a specific syntax)
5. Chooses the first one
6. Sends that command to the serial port (where the Arduino is expected to be)
Usually, signing in to a Twitter account requires the owner of the account to visit the Twitter website in a web browser and insert their username and password manually. Can Python do that by itself? Well, of course it cannot. But there is a way to make this process automatic. First, we had to go to the Twitter developers website.

Then, we created an application by giving it a name and a description. Once we created our application, we got the information for OAuth (http://oauth.net/). OAuth is an authentication protocol that allows users to approve applications that can act on their behalf without sharing their password. Specifically, the information required is: Consumer Key, Consumer Secret, Access Token and Access Token Secret. That information is supplied by Twitter in the application itself.

figure 13

After creating our application and getting the relevant information, we installed the python-twitter library on the computer. This library provides a pure Python interface to the Twitter API, so we can perform these operations using just Python code. A complete description of this API can be found here. Some of the methods we used are:

– import twitter # imports the python-twitter library
– api = twitter.Api(consumer_key, consumer_secret, access_token_key, access_token_secret) # creates the connection
– api.GetFriends() # gets the accounts the duck follows (its "friends")
– api.GetUserTimeline(user.screen_name) # gets a user's timeline

To send a command to the duck, the user must write a tweet with the following format: "… @H9832B … $command$ …". The tweet must mention "@H9832B" and the command must be written between "$" signs. Right now, the duck can perform two operations and understands three commands: "wings" to move its wings, "head" to move its head and beak, and finally "everything" to move its wings, head and beak. The Python script checks every 10 seconds for new tweets. If there is a new tweet, the command (the text between "$") is extracted and sent to the serial port, where the Arduino is supposed to be. The information sent to the Arduino has the following format: "@user -> command"
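To illustrate the extraction step, here is a minimal sketch in Python. The exact pattern our script uses may differ, and extract_command is just a name we use here for illustration:

import re

def extract_command(text):
    # return the text between "$...$" if the tweet mentions @H9832B
    match = re.search(r'@H9832B.*\$(\w+)\$', text)
    return match.group(1) if match else None

print(extract_command('Hey @H9832B $wings$ please!'))  # prints: wings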

figure 14

The other important issue concerning the Python script is its communication with the Arduino. Talking to the Arduino over a serial interface is pretty trivial in Python: there is a wrapper library called pySerial that works really well and encapsulates access to the serial port. Data transmission over the serial port happens byte by byte; therefore, to send a whole text, it must be sent character by character. The methods we used are:

– import serial # imports the pySerial library
– arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1) # creates the connection with the serial port
– arduino.write(character) # writes a character to the serial port (in other words, sends a character to the Arduino)
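Putting both libraries together, here is a minimal sketch of how the whole script could look. The credentials are placeholders, and the regular expression and the last_seen bookkeeping are our illustration rather than the exact script:

import re
import time
import serial   # pySerial
import twitter  # python-twitter

# OAuth credentials from the Twitter application (placeholders)
api = twitter.Api(consumer_key='...', consumer_secret='...',
                  access_token_key='...', access_token_secret='...')

arduino = serial.Serial('/dev/ttyACM0', 9600, timeout=1)

last_seen = None  # id of the last tweet already executed

while True:
    for user in api.GetFriends():                 # accounts the duck follows
        tweets = api.GetUserTimeline(user.screen_name)
        if not tweets:
            continue
        tweet = tweets[0]                         # the friend's latest tweet
        match = re.search(r'@H9832B.*\$(\w+)\$', tweet.text)
        if match and tweet.id != last_seen:       # a new command
            last_seen = tweet.id
            message = '@%s -> %s' % (user.screen_name, match.group(1))
            for character in message:             # serial goes byte by byte
                arduino.write(character)
            break                                 # choose the first one
    time.sleep(10)                                # poll every 10 seconds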

ARDUINO AND THE DUCK

When a command is sent to the Arduino, two things can happen: if the command is known, the duck obeys the instruction and the message "@user -> command" is displayed on the screen; if not, the duck does nothing and the screen displays the message "I don't understand". In addition, the duck blinks periodically. This behavior is possible thanks to all the electronic components involved: two LEDs (for its eyes) and two servo motors inside the duck, one for its wings and another for its head and beak, and finally an Arduino and a sensor shield to control those components. A proper program running on the Arduino is also required.

figure 15

The Arduino is programmed in a language called Wiring. The program is written in the Arduino IDE, which you must first install on your computer. When the program is done, it is uploaded to the Arduino and runs there. This program manages every component in our project.

figure 16

First of all, we needed to set up every component: pins, positions, etc. We should clarify that we did not connect the components to the Arduino directly; we used the shield instead, so every time we say "connected to the Arduino" we mean "connected to the shield". We have two LEDs connected to two pins, two servo motors connected to two other pins, and the LCD screen connected to BUS1 on the shield. It is very important to check that the pins used by the bus do not overlap with the pins used by the LEDs and the motors. Since the motors make rotating movements, their positions had to be set up as angles (in degrees), so we needed to give the initial and final angle of each motion. We also defined special counters for the blinking of the duck's eyes and some constants for the screen: number of rows, number of columns, scrolling speed, etc. Secondly, the program starts with the setup() method. The project configuration is handled here:

– We set up pin modes and initial values (HIGH) for turning LEDs on, and attached the motors to their corresponding pins.
– We created the connection with the serial port (from where the computer is sending the commands)
– We initialized the LCD screen: size (2 rows and 16 columns) and the initial message “Hi, I’m H9832B Duck”.

Finally, the loop() method runs over and over again. The loop() is the heart of the program, and its code controls the behavior of the duck and the screen (see the sketch after this list). Specifically:

– It checks for new commands.
– It makes the duck blink if applicable.
– It scrolls the screen.
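Here is a minimal sketch in Wiring of how our setup() and loop() are organized. The pin numbers and the three helper names are illustrative (the helpers are sketched in the sections below), not our exact code:

#include <Servo.h>
#include <LiquidCrystal.h>

const int LEFT_EYE = 2, RIGHT_EYE = 3;  // illustrative pin numbers
Servo wingsServo, headServo;
LiquidCrystal lcd(8, 9, 4, 5, 6, 7);    // BUS1 wiring (illustrative)

void checkForCommand() { /* sketched in the Commands section */ }
void blinkIfNeeded()   { /* sketched in the Blinking section */ }
void scrollScreen()    { /* sketched in the Scrolling section */ }

void setup() {
  pinMode(LEFT_EYE, OUTPUT);            // pin modes and initial values
  pinMode(RIGHT_EYE, OUTPUT);
  digitalWrite(LEFT_EYE, HIGH);         // LEDs start turned on
  digitalWrite(RIGHT_EYE, HIGH);
  wingsServo.attach(10);                // motors on their pins (illustrative)
  headServo.attach(11);
  Serial.begin(9600);                   // the computer sends commands here
  lcd.begin(16, 2);                     // 16 columns, 2 rows
  lcd.print("Hi, I'm H9832B Duck");     // initial message
}

void loop() {
  checkForCommand();                    // read and obey known commands
  blinkIfNeeded();                      // make the eyes blink periodically
  scrollScreen();                       // advance the message one step
  delay(400);                           // one step of the loop (0.4 s)
}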

Scrolling:

We wanted the messages on the screen to appear on the bottom row, moving from right to left. When a message starts to disappear on the left, it begins to appear on the top row, also moving from right to left; and when it disappears completely on the left, it reappears on the bottom row and the process starts all over again. The LCD library includes its own scroll() method, but it scrolls each row independently, so it did not offer the behavior we needed.

Our own screen_scroll_next() function does not actually scroll the screen. It just prints a different message at each step, and by doing so creates the effect of scrolling. To exemplify, imagine a screen with 1 row, 6 columns and the message "hello"; here is the sequence of messages we would print:

01: _ _ _ _ _ _
02: _ _ _ _ _ h
03: _ _ _ _ h e
04: _ _ _ h e l
05: _ _ h e l l
06: _ h e l l o
07: h e l l o _
08: e l l o _ _
09: l l o _ _ _
10: l o _ _ _ _
11: o _ _ _ _ _
01: _ _ _ _ _ _
The effect is that the message "hello" appears on the right and disappears on the left, and then the cycle restarts. Simple but effective! The message "scrolls" one character for each step of the loop.
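A minimal one-row version of this idea in Wiring could look like the sketch below; our real function handles both rows, and the names here are illustrative:

#include <LiquidCrystal.h>

const int COLS = 16;
String message = "Hi, I'm H9832B Duck";
int scrollStep = 0;

void screenScrollNext(LiquidCrystal &lcd) {
  String padding = "";
  for (int i = 0; i < COLS; i++) padding += ' ';
  String padded = padding + message + padding;  // blanks on both sides
  lcd.setCursor(0, 1);                          // bottom row
  lcd.print(padded.substring(scrollStep, scrollStep + COLS));
  // move the window one character and wrap around to restart the cycle
  scrollStep = (scrollStep + 1) % (message.length() + COLS);
}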

The screen always shows the same message until an event makes it switch. If no tweet has arrived, the screen shows "Hi, I'm H9832B Duck". When a command is received, the message shows the last user and command executed, in the format "@user -> command". If the command is unknown, the message shown is "I don't understand".

figure 17

Blinking:

Approximately every 5 seconds, the duck blinks. We have a special counter that starts at 0 and increments by 400 (0.4 seconds) with every step of the loop(). When that counter reaches 4800, the duck blinks once; when it reaches 9600, the duck blinks twice; and finally, when it reaches 10000, the duck blinks three times. Then the counter restarts. This sequence creates a non-uniform, natural blinking.
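In Wiring, the counter logic could be sketched as below; the blinkEyes() helper, its 100 ms timing and the pin numbers are our illustration:

const int LEFT_EYE = 2, RIGHT_EYE = 3;  // illustrative pin numbers
long blinkCounter = 0;

void blinkEyes(int times) {
  for (int i = 0; i < times; i++) {
    digitalWrite(LEFT_EYE, LOW);        // eyes off
    digitalWrite(RIGHT_EYE, LOW);
    delay(100);
    digitalWrite(LEFT_EYE, HIGH);       // eyes back on
    digitalWrite(RIGHT_EYE, HIGH);
    delay(100);
  }
}

void blinkIfNeeded() {                  // called once per loop() step
  blinkCounter += 400;                  // each step is 0.4 seconds
  if (blinkCounter == 4800) blinkEyes(1);
  if (blinkCounter == 9600) blinkEyes(2);
  if (blinkCounter >= 10000) {
    blinkEyes(3);
    blinkCounter = 0;                   // restart the sequence
  }
}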

We are aware that the red color gives our robot a psycho-killer look, but that is the way we like it!

figure 18 servo motor

Commands:

At every step of the loop, the program checks for new commands, that is, whether new information has arrived from the serial port. Because the data transmission over the serial port happens character by character, we coded a specific function that reads the serial port character by character and puts the characters together to build the whole message. The Python script takes care of sending messages in the format "@user -> command", but it is the Wiring program that checks whether the "command" is known to the duck. To make this possible, we wrote another function, getInstruction(message), to extract the right side of the "->". If the instruction obtained is known, the duck performs the corresponding action; if it is not, the duck does nothing and the screen shows the message "I don't understand".
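A sketch of those two functions in Wiring might look like this; getInstruction() is named above, while readMessage() and the details are our illustration:

// Rebuild the whole message from the characters on the serial port.
String readMessage() {
  String message = "";
  while (Serial.available() > 0) {
    char character = Serial.read();  // one byte at a time
    message += character;
    delay(5);                        // give the next byte time to arrive
  }
  return message;                    // e.g. "@user -> wings"
}

// Extract the right side of the "->", i.e. the instruction itself.
String getInstruction(String message) {
  int arrow = message.indexOf("->");
  if (arrow < 0) return "";
  String instruction = message.substring(arrow + 2);
  instruction.trim();                // drop surrounding spaces
  return instruction;                // e.g. "wings"
}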

What happens if the duck understands the instruction? It can perform two actions: (1) move the wings, or (2) move the head and beak. For this, what we really do is write new angle values to the servo motors. Each motor has a rest angle (set up in the setup() method); when a command is executed, this value is changed, a little arm on the motor rotates to the new position, the position is held for the necessary time, and then the original value of the angle is restored. Some movements involve more than one repetition: for example, the duck moves its wings three times, which means the program on the Arduino switches between the initial and final value of the angle three times.
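For example, the wing movement could be sketched like this in Wiring; the rest and raised angles and the hold time are our assumptions:

#include <Servo.h>

Servo wingsServo;                    // attached to its pin in setup()
const int WINGS_REST = 20;           // rest angle (our assumption)
const int WINGS_UP = 70;             // raised angle (our assumption)

void moveWings(int times) {
  for (int i = 0; i < times; i++) {
    wingsServo.write(WINGS_UP);      // rotate to the new position
    delay(300);                      // hold it for the necessary time
    wingsServo.write(WINGS_REST);    // restore the original angle
    delay(300);
  }
}
// the "$wings$" command would call moveWings(3)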

When the motors move, their little arms move the wings and the head. This process was explained above, in the second part of the post.

figure 19

Three small projects helped us with our own. Here are the links to those projects, in case you are interested:

– Tweet-a-Pot: Twitter Enabled Coffee Pot

– SimpleTweet_01 python

– Send a Tweet to Your Office Door: And let your coworkers know what you’re up to

What is next?

Before I begin explaining the exciting project that I am doing with my partner (antoniojimenezmavillard.wordpress.com), I want to recapitulate what we have been doing in class. While working on the main project, we are also practicing with programming in the Processing language, vector representation and 3D representation.

To program the Arduino, we write code on our desktop in a language called Wiring, compile it, verify it and then upload it to the Arduino, usually via a USB connection. As long as the Arduino is getting power, it will execute the code over and over again. Below you can see the code for blinking an LED:

void setup() {               
  // initialize the digital pin as an output.
  // Pin 13 has an LED connected on most Arduino boards:
  pinMode(13, OUTPUT);    
}
 
void loop() {
  digitalWrite(13, HIGH);   // set the LED on
  delay(1000);              // wait for a second
  digitalWrite(13, LOW);    // set the LED off
  delay(1000);              // wait for a second
}

The best-known vector-based drawing programs are the commercial packages Adobe Illustrator and CorelDRAW, which help us create distinctive vector artwork for any project. In class, however, we have been using an open-source program called Inkscape instead: http://inkscape.org/.

Being able to express ourselves by making pictures with a drawing program is a very useful skill for making sketches, storyboards, presentations, exhibit mockups, and the like. In this case we are working with two-dimensional images in vector format; after that, we started learning to create three-dimensional representations. CAD (Computer-Aided Design) is an important industrial art tool, extensively used in many applications, including the automotive, shipbuilding and aerospace industries, industrial and architectural design, prosthetics and many more.

CAD is also widely used to produce computer animation for special effects in movies, advertisements and technical manuals. It is used to design curves and figures in two-dimensional space, or curves, surfaces and solids in three-dimensional space. Nowadays there is a wide range of programs for working in 3D, ranging from animation (Maya, Blender) to engineering (AutoCAD, SolidWorks, Rhino). Most of them are very powerful and require months or years to learn to use well.

In class we are using Google SketchUp, a free and easy-to-use tool for creating 3D representations. The most exciting thing about using SketchUp in our class is that we can print our models in plastic using the MakerBot Thing-O-Matic 3D printer that our teacher, William Turkel, just built with other students: http://www.makerbot.com/.

Building our project.

While thinking about what we could create for our project in the class History 9832B Interactive Exhibit Design (Winter 2012), we came across a bunch of ideas. One of these ideas was that our final project for H9832B could be a Screen Office. What does this mean? Imagine that you are a teacher and one day you have office hours, or simply have to be at the office, but for whatever reason you are late or cannot make it. The problem, if you have students waiting at your office, is that they do not know you cannot come in today. Our Screen Office could be the solution, because it would allow you to send a tweet from your cell phone or any computer to the Screen Office, built using an Arduino, an Ethernet Shield (http://arduino.cc/en/Main/ArduinoEthernetShield) and the Arduino LiquidCrystal library for the LCD interface (http://www.arduino.cc/en/Tutorial/LCDLibrary).

If you are interested in the Arduino history, you can see an Arduino documentary here: http://redux.com/stream/item/1909904/Arduino-The-Documentary-2010-English-HD

You can also download it here: http://www.archive.org/details/Arduino.TheDocumentary.English

You can see the trailer here:

For various reasons we decided to shift from this idea to another one. Our final project, which we called the H9832B duck, is about creating a duck robot with an Arduino and a liquid crystal display. We would also be using an Arduino UNO, a Micro Maestro #1350, a battery, the LCD and Electronic Brick Chassis, 9 GR servos, etc.

The other day, Antonio and I went downtown looking for a doll or puppet that could give us some idea for our robot. After more than four hours, we ended up buying a small, cute duck for $25. As for the duck that my partner Antonio liked so much, well, I came up with the idea of destroying it and rebuilding it as an Arduino machine.

We want to see what a robot could look like from the inside; we are interested in the idea of "the beauty of the machine", so we want to show the reality of things. It is like opening a cute Mac laptop and seeing what is happening inside the machine. As David Cuartielles of the Arduino team said: "…that's open source hardware for me: it means to once again be able to check what is it that is inside of stuff, but in a way that is allowed; that is also ethically right, legal and that allows us to improve the educative methods. All things considered what open hardware is for me, is a system that makes people able to learn about the way things work in this world we live in, where there are more computers than people." From Arduino: The Documentary (2010).

At the beginning, I must admit, I felt bad for my partner Antonio because he liked the duck so much, but now he likes the idea of opening it up and showing what is inside.

What is next?

After Digital History 9808A, the first course I took with Professor William J. Turkel, I am now facing a new experience with my second course, Interactive Exhibit Design 9832B. What we are doing now is working with the Arduino platform.

The Arduino platform is an open-source electronics prototyping platform based on flexible, easy-to-use hardware and software. It is intended for artists, designers, hobbyists, and anyone interested in creating interactive objects or environments. (This definition was taken from the Arduino website.)

Here we can see an example of what we can do with Arduino: an Arduino robot (video). The Arduino uses a simple stop-and-start code. Most of the information about this can be found here: http://web.mac.com/michaelalves/Arduino/Step_1.html

In the Exhibit Design course previously mentioned, I am working with my colleague Antonio. I must admit, I have always preferred working in a group rather than working alone, particularly in this case, because it is a good way to improve our understanding of how the Arduino works and how we can be more creative, especially in the sense of getting interesting outcomes.

While we are thinking about what we can do with these small computer pieces that Professor William J. Turkel has provided us, we are also learning in class how to handle the Arduino. The first class was an introduction to Arduino, and we had to download the Arduino software from here: http://arduino.cc/hu/Main/Software. By doing this, we can make connections between the Arduino platform, the small pieces and the laptop.

First class

Our first step was to create the blinking light by copying and pasting the blink code that we found here: http://arduino.cc/en/Tutorial/Blink. Then we did the second example, Pushbuttons, which can be found here: http://arduino.cc/en/Tutorial/Button. The last one was Button State Change Detection: http://arduino.cc/en/Tutorial/ButtonStateChange

Second class

In our second class, we did more examples, such as Digital Read Serial. This example shows you how to monitor the state of a switch by establishing serial communication between your Arduino and your computer over USB: http://arduino.cc/en/Tutorial/DigitalReadSerial

Then we started with Analog Read Serial. This example shows you how to read analog input from the physical world using a potentiometer: http://arduino.cc/en/Tutorial/AnalogReadSerial

The last thing we did in this class was to work through the Electronic Bricks Cookbook, and then we tried the LiquidCrystal library to control the LCD. Our attempt was a complete failure: http://www.seeedstudio.com/wiki/images/a/a9/Electronic_Bricks_Starter_Kit_Cookbook_v1.3.pdf

READING ANY LANGUAGE

It would be a tool that would increase reading among people, and it would also make the distance between all of us smaller. We would understand and respect one another more, and most likely have a better world, one with fewer wars and more love. We live in and share the same place, but we tend to lock ourselves in our communities and do not make the effort to know other cultures. We do not have to understand the different behaviors of different cultures, but we do have to respect them.

Perhaps all of this has to do with the lack of communication between us. We do not know how to read different languages, and most people cannot communicate with others, so we do not know why we are different. Imagine how powerful it would be to be able to read a book, a newspaper, an article, a magazine or any other text in any language, with the simple use of glasses or contact lenses.

I mean that these new lenses would have the ability to recognize any text in any language and instantly translate it into the language the user desires. Yes, I am talking about the kind of glasses and lenses that many people wear to protect their eyes from the sun or to see without difficulty. The idea of using lens technology is not new at all.

Babak A. Parviz, a bionanotechnology expert at the University of Washington in Seattle, has released a report detailing just where bionic eye technology (or at least the augmented-vision contact lens) is right now, and it is pretty amazing. It can translate or capture speech in real time, augment video-game playing, and allow users to view graphics without a physical display device. There are still some obstacles to overcome before the lenses can be released for use. Most notably, the components will have to be made extremely small as well as transparent, so that users can still actually see their surroundings while wearing the lenses. There is also the problem of making all the components safe for long-term use in the eye. Mr. Parviz seems confident that these challenges can be overcome, though, and assures us that the augmented-vision display will one day be as ubiquitous as iPhones.

http://gajitz.com/bionic-vision-cybernetic-lenses-with-heads-up-display/

Project adds another dimension to campus

Western News

http://communications.uwo.ca/western_news/stories/2012/January/project_adds_another_dimension_to_campus.html

By Adela Talbot

January 05, 2012

Mohammed Afana is putting The University of Western Ontario on the map.

Literally.

A first-year PhD student in Hispanic Studies, with research focused on digital architecture, Afana is developing a 3D model of campus for Google Earth that will serve as a stepping stone for a mobile application making Western accessible to anyone, anywhere, anytime.

“The new experience is not physical but virtual. There is a new kind of urbanism; you can now visit different cities (through) a computer,” Afana says.

Mohammed Afana, a PhD student, is developing a 3D model of Western’s campus. The model will be a stepping stone for a mobile app meant to revolutionize time and space management at Western.

A holistic, virtual Western experience is the ultimate goal of the 3D-campus project, he explains, and one of many novel initiatives coming out of the CulturePlex lab, spearheaded by Juan Luis Suárez, a Hispanic Studies professor.

Afana, who can be spotted around campus with a camera, takes photos of Western’s buildings and imposes them on 3D models he builds in Google SketchUp. He has created models of nine buildings so far and is halfway to completing his part of the project.

Meanwhile, computer science students are working with the lab on Phase 2 – developing an ‘augmented reality’ mobile application that will pair 3D models of campus buildings with the university’s internal scheduling system.

“The idea is to (be able to) access the university from your cell phone,” explains Camelia Nunez, a linguistics PhD student who works with Suárez in the CulturePlex lab.

“Once everything is on the map in 3D, we are planning to design an app to allow you to orientate yourself on campus. With a smart phone, you could point it at a building and it would tell you what building it is, and if you’re trying to get to another building, it will tell you how,” she says.

The app would also be connected to the internal scheduling system of the university, she explains. So you could scan a room’s door and the app would tell you the schedule for all the classes in the room.

Nunez says the lab isn’t aware of another project like this, and that a 3D model of Western’s campus – and, of course, the app itself – would likely be a first for higher education.

While the 3D models are the first step of the project, the possibilities are limitless, Suárez says.

For now, the project and the app will focus on making life easier for Western’s students, faculty, staff and visitors.

“Universities like Western, whose priorities are to provide the best student experience, could benefit from the latest technology in mobile services (being) connected to the timetable and facilities admin systems,” Suárez says.

The app will keep everyone informed of events and create an easier approach to managing time and space on campus.

Suárez says the idea was conceived in the lab following a multidisciplinary chat about Google Earth. The project is a good example of the lab’s vision of research and development, followed by service and commercialization.

While the completion date for the project depends largely on funding and programming of the app, Afana believes everything should be done by April.

“If you look and see London on Google Earth, the downtown is the only thing in 3D. There’s not much, and it’s just the outside of the buildings. But if you find the campus, you can see (3D models) live already,” Afana says. “When we are done, anyone could visit campus (inside and out) from anywhere.

“It’s a big project, but it is a good idea and it will make the university different.”

Eaton's Catalogue 1913-14

The Eaton's Fall and Winter catalogue reminds me of Architects' Data by Ernst and Peter Neufert, because both provide a great deal of information about each product.

Starting on page 282, I found a list of books that Eaton's was selling that season; I chose the following books with the purpose of finding full online digital copies of them.

– Dr. Gunn's New Family Physician (John C. Gunn)
– What a Young Wife Ought to Know (Emma Drake)
– Canada from Ocean to Ocean (J. Lawlor Woods)
– The Art of Sign Painting (Frank H. Atkinson)
– The Money Moon (Jeffery Farnol)
– The Missing Bride (E.D.E.N. Southworth)
– The Pathfinder (J. Fenimore Cooper)
– Modern Carpentry (Fred T. Hodgson)

Here are the results:


Dr. Gunn's New Family Physician appears as Gunn's Household Physician, or Home Book of Health: an encyclopedia of health hints, with everything explained in plain language. It tells how to avoid sickness, what to do in an emergency and how to care for the sick, and gives the causes, symptoms and treatments of various diseases.

What a Young Wife Ought to Know.

But you can also find it here.

Canada from Ocean to Ocean. A splendidly arranged view book, showing 65 excellent views of various parts of our Dominion.

The Art of Sign Painting. I could not find it, but I found a similar book by the same author, Frank H. Atkinson (original from the University of Wisconsin): a book of thorough instruction in practical sign writing, illustrated in a plain and practical way.

The Money Moon. Fiction

But you can also find it here.


The Missing Bride. Novel

But it is also available as an audiobook.

The Pathfinder. Novel

Modern Carpentry. In two volumes, a practical manual: 530 pages.

Intelligent Search Engines for Images


One of my most important experiences on the internet has been searching for images and maps online. I had big databases and had to tag many images from all around the world, but the problem was that I had no information at all, that is to say, no names, no locations, etc. So what could I do when all I had were the images themselves?


Nowadays there are different engines that allow this kind of search, depending on what kind of data you have. The search process is easy and exciting.


You can use Google Images, TinEye, Retrievr, Google Maps (photos) or Flickr. All of these engines will help you find what you are looking for; it is like using Google to find any other kind of information.


For me, Google Images and TinEye are very similar, because I find the same results through the same process, so I will explain how TinEye works. The goal is to help you find the author of an image, where it came from, where it is being used, where you can find additional images from the same author, and/or any other information that might be useful, for example higher-resolution versions.


TinEye (www.tineye.com) offers two methods of search: by URL or by uploading an image.


To search by URL, just paste the address of the image that you are looking for; for example, go to www.tineye.com and paste the URL of an image.


You should get results like the following: 22 results. Searched over 2.0678 billion images in 0.08 seconds.

To search by upload (that is, to search with an image from your local hard drive), just click the Browse button and locate the file. For example, I had this image on my computer but no information about it at all.

I got the following result: 1 result. Searched over 2.0678 billion images in 1.213 seconds.

You can see the difference between the original image and the image that I found.

In this case, it turns out that Google Images (http://www.google.com/imghp?hl=en&tab=wi) is more sophisticated, because it gives me more information about this particular image: about 35 results (in 0.56 seconds).

I usually use these search engines, and I have realized that Flickr (http://www.flickr.com) serves as a database for them.

TinEye has plug-ins for Firefox, Chrome and IE. With the plug-in, simply right-click on any web image (it won't work with images on your local hard drive) and select TinEye from the context menu. You will be whisked away to the TinEye website to see your results.

TinEye can find the largest image or the most transformed image in a set of search results; simply take advantage of the SORT option on the left of the screen.

Best Match is the default sort option and shows the images that are closest to your original image first.

Most Changed shows the images that are the most transformed from your original image first.

Biggest Image shows you, you guessed it, the highest-resolution versions of the image you searched for first.

The Compare tool allows you to quickly switch back and forth between your result image and your original image. This animates any differences between the two images, making changes easier to see. It is especially handy if you have sorted the images by Most Changed and your image matches have been heavily edited.

If you would like to share your search results with others, you can easily do this via Twitter, Facebook, or pretty much any other social network. Just use the Share tools on the left side of the results screen.

These are fantastic, intelligent search engines for images. But I would also like to mention a different kind of image search engine.

Retrievr (http://labs.systemone.at/retrievr/) is an experimental service which lets you search and explore a selection of Flickr images by drawing a rough sketch. Currently the index contains many of Flickr's most interesting images.