Intro

Welcome! This site collects my work and reflections for the Virtual & Augmented Reality course. By the way, check out my awesome work below.


Student's Choice: Omnidirectional treadmill

Navigation in VR and Omnidirectional treadmills
Virtual Reality is a peculiar technology: its history dates back to the 1950s, and it has kept evolving since then. Nowadays we have great VR applications that run on powerful machines but also on cheap smartphones, and we have powerful headsets and other devices able to create incredibly immersive experiences in the virtual world while tracking our movements in the real one. However, one problem still persists: the limited space in the real world where the user can move.
This is a major limitation for users who want to immerse themselves in a virtual world where they can do whatever they want.
Until now, virtual-world designers have used various tricks to let the user move through a virtual world that is usually much larger than the physical space available in the room where the user is. We saw some examples during classes.
In some VR games the user can move the avatar using the thumbstick on the controller. This is probably the most intuitive way to solve the problem, because it is easy and familiar (most people who have played traditional video games are already used to it), but this type of movement control in a VR application can lead to simulator sickness (if you are interested in this topic, you can find more information and some interesting links here: https://en.wikipedia.org/wiki/Virtual_reality_sickness).
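To make the idea concrete, here is a minimal sketch of this kind of thumbstick ("smooth") locomotion in Unity. It is only an illustration under my own assumptions (the axis names, the CharacterController setup and the head transform are placeholders, not taken from any specific game):

using UnityEngine;

// Minimal thumbstick ("smooth") locomotion sketch for a VR rig.
// Assumes a CharacterController on the rig root and a reference to the
// HMD camera transform; the axis names are placeholders.
public class SmoothLocomotion : MonoBehaviour
{
    public Transform headTransform;   // the HMD camera
    public float speed = 2.0f;        // metres per second
    CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // Read the thumbstick (placeholder axis names).
        float x = Input.GetAxis("Horizontal");
        float y = Input.GetAxis("Vertical");

        // Move relative to where the user is looking, staying on the ground plane.
        Vector3 forward = Vector3.ProjectOnPlane(headTransform.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(headTransform.right, Vector3.up).normalized;

        controller.Move((forward * y + right * x) * speed * Time.deltaTime);
    }
}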
The other approach that has been used in many applications is "teleporting": the user points at a place where they want to go and is teleported right there. This is an easy way to solve the navigation problem, but in my opinion it is the worse one, because it is quite annoying and it makes everything feel so strange that the user loses the sensation of immersion given by all the other components of the virtual application.
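A teleport mechanic is also simple to sketch: raycast from the controller and, on a button press, move the whole rig to the hit point. Again, this is only a hedged illustration with placeholder names (the button name and rig wiring are assumptions), not the implementation of any particular application:

using UnityEngine;

// Minimal teleport locomotion sketch: aim a controller at the floor and
// jump the whole rig to the hit point when a button is pressed.
public class TeleportLocomotion : MonoBehaviour
{
    public Transform rig;            // root of the VR rig to move
    public Transform pointer;        // controller transform used for aiming
    public LayerMask teleportMask;   // layers that count as valid floor

    void Update()
    {
        // Placeholder button name; a real project would use its VR input bindings.
        if (Input.GetButtonDown("Teleport") &&
            Physics.Raycast(pointer.position, pointer.forward, out RaycastHit hit, 30f, teleportMask))
        {
            rig.position = hit.point;   // an instant jump avoids continuous motion, reducing sickness
        }
    }
}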
Looking at these examples, people have understood that the right way to create an immersive application is to give users a way to move their bodies when they have to move in the virtual world; some interesting ideas are implemented in applications where you can move using gestures that mimic natural movements.
Unfortunately, though, none of these methods works very well: they are not enough to create an experience that is completely immersive.
A final solution for this problem is provided by the omnidirectional treadmills. “An omnidirectional treadmill (ODT) is a mechanical device, similar to a typical treadmill, that allows a person to perform locomotive motion in any direction, allowing for 360 degrees of movement. The ability to move in any direction is how these treadmills differ from their basic counterparts (that permit only unidirectional locomotion). Omnidirectional treadmills are employed in immersive virtual environment implementations to allow unencumbered movement within the virtual space” [1]
This new kind of device is the key to obtaining a truly immersive experience, in which users will be able to move and explore virtual worlds with complete freedom and without needing a big room. Moreover, this technology will probably eliminate most of the scenarios in which users hurt themselves during a VR experience, because many of those accidents are caused by users moving too much in too small an area (some examples are shown in this video: https://www.youtube.com/watch?v=0KcllPEe8y8).
Each implementation of an ODT uses a different type of technology, but there are two main approaches to designing this kind of device. The first is to create a large base on which the user can move with complete freedom; the ODT then needs sensors to detect the user's movement and keep them at the center of the platform by moving the treadmill in the right direction. The second approach is to create a device in which the user wears a sort of harness that transmits their movement to the processing unit of the ODT. This approach obviously has some disadvantages for the user, because it constrains the movements the user can make and it is less comfortable. On the other hand, harness-based devices require less space: it is easier for the device to track the user's movements, so it can adapt faster than in the other case, and the harness mechanically helps keep the user at the center of the platform (not to mention that the harness can also be used to give feedback to the user during the VR sessions of some applications).
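To make the first approach more concrete, here is a heavily simplified sketch of the control idea behind keeping the user centered: treat the user's displacement from the platform center as an error and drive the belt in the opposite direction, proportionally. This is purely my own illustration of the principle, not how Infinadeck or any real ODT is implemented, and the gain and speed limit are invented numbers.

using UnityEngine;

// Toy proportional controller for the "active base" ODT idea: the belt
// velocity opposes the user's displacement from the center, so walking in
// any direction pulls the user back toward the middle of the platform.
public static class TreadmillCentering
{
    // userOffset: user's position relative to the platform center (metres, XZ plane).
    // gain: how aggressively the belt reacts (1/second).
    // maxBeltSpeed: hardware speed limit (metres per second).
    public static Vector2 BeltVelocity(Vector2 userOffset, float gain = 1.5f, float maxBeltSpeed = 2.0f)
    {
        Vector2 command = -gain * userOffset;            // push back toward the center
        return Vector2.ClampMagnitude(command, maxBeltSpeed);
    }
}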
There are many different implementations of omnidirectional treadmills; here I want to present the most famous ones.

Some implementations
The first ODT I want to mention is the one that is now very famous thanks to the movie “Ready Player One”, which many people have seen this year. The movie “takes place in 2045, when much of humanity uses the virtual reality software OASIS to escape the desolation of the real world” [2]. In this movie people can do everything in the OASIS world; they are completely immersed in that reality. The main character uses three interesting devices to obtain this kind of immersion: a wireless VR headset, tactile gloves and bodysuits, and an omnidirectional treadmill. The last one is the most interesting, because the device used in the movie actually exists: it is the Infinadeck. This device is awesome; it “is an active omni-directional VR treadmill that lets you walk in any direction. Compared to others, which generally rely on low friction surfaces to emulate the act of walking, Infinadeck’s design actually has a tread that moves beneath you, leading to a much more natural gait” [3].
(Image from https://www.youtube.com/watch?v=SVs7iegtDIk) [3]
(Image from [3])
In its first releases, this device needed a human operator who manually watched the user’s movements and set the treadmill to move in the opposite direction. Now, however, the treadmill is almost ready: with a ring around the user, it can recognize their movements and react to them on its own. The company has also started a beta program for developers and testers.
(Image from [5])
Another interesting device, with a completely different approach, is the Virtuix Omni. This device has no moving parts in the base to keep the user at the center of the platform; instead, the user never moves from the initial position because “the surface is bowl-shaped and requires special low friction shoes or shoe covers. It uses inertial sensors to track a person's position, the length of their stride, and how fast they are moving. The information is then sent to a computer which translates the data into the game movements. The updated Omni Harness design keeps the player stable in the Ring, without needing leg straps. As it rests on top of the support Ring, the player is able to turn rapidly while walking, jogging, or running, without the user having to rotate the vertical support along with them. Since its first release, the motion tracking has also improved, allowing for a wider range of player speed” [6].
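Conceptually, the pipeline the quote describes (sensors → stride estimate → game movement) could be sketched like this; the mapping and parameters are my own guesses, purely for illustration:

using UnityEngine;

// Toy mapping from tracked gait data to in-game movement, roughly the
// pipeline the Omni description suggests (sensors -> stride -> game motion).
public static class GaitToLocomotion
{
    // strideLength: estimated metres per step, cadence: steps per second,
    // bodyYawDegrees: which way the player's body is facing on the platform.
    public static Vector3 AvatarVelocity(float strideLength, float cadence, float bodyYawDegrees)
    {
        float speed = strideLength * cadence;                          // m/s walked in place
        Quaternion heading = Quaternion.Euler(0f, bodyYawDegrees, 0f); // rotation around the up axis
        return heading * Vector3.forward * speed;                      // world-space velocity for the avatar
    }
}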
If you are interested in other examples of this type of device, you can find a list of other interesting omnidirectional treadmills at link [5] in the references at the end of this page.
Now I want to take a look at some interesting applications of this technology.
Omnidirectional treadmills can be used in filmmaking, for example in conjunction with the green screens used nowadays to add computer graphics to a movie, because an ODT would allow limitless movement by the actors in front of the green screen.
ODTs can also be used for military and sport training; in particular, the Virtuix Omni has already been used for these kinds of applications.

Medical study and treatment of FOG
One of the most important applications of virtual reality is its use in healthcare and clinical therapy. It is used in medicine for training (like surgery training) but also for therapies such as pain management, anxiety-disorder treatment and rehabilitation. Rehabilitation techniques, in particular, have used VR for decades now, but the new treadmill technology opens up new opportunities.
In particular, interesting research has been done on a therapy that uses a VR-based body-weight-supported treadmill interface (BWSTI) to investigate freezing of gait (FOG), which “is the most common yet poorly understood gait phenomenon found in Parkinson’s disease. It is defined as an episodic inability to generate effective stepping” [4].
The idea behind this research is to use the designed treadmill as a safe platform, under the researchers’ control, to study this strange behavior of Parkinson’s disease. The patients walk on the treadmill wearing a harness connected to a body-weight support system, which lets them move safely and helps them by supporting a constant fraction of their body weight. Reflective markers on the patients’ shoes and harness are used by motion-capture cameras to collect precise movement data.
(Image from [4])
During the tests, the researchers use the patients’ movement data to adapt the speed of the treadmill controller, so that it follows the person’s movements and recognizes when freezing of gait manifests. To test all the possible scenarios in which it could manifest, the patients wear a virtual reality headset that provides realistic visual stimuli that might trigger FOG.
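As a purely illustrative sketch (the controller described in the paper is more sophisticated and is not reproduced here), adapting the belt speed to the patient and flagging a possible freeze could look something like this; the thresholds and smoothing factor are invented for the example:

using UnityEngine;

// Illustrative-only sketch of a self-paced treadmill loop: smooth the belt
// speed toward the patient's measured walking speed, and flag a possible
// freezing-of-gait episode when the patient nearly stops while the belt is
// still clearly moving. All numbers are invented.
public class SelfPacedTreadmill
{
    float beltSpeed;                              // current belt speed (m/s)
    const float smoothing = 0.5f;                 // how quickly the belt follows the patient (1/s)
    const float freezeSpeedThreshold = 0.05f;     // patient nearly stationary (m/s)
    const float movingBeltThreshold = 0.3f;       // belt still clearly moving (m/s)

    // measuredWalkSpeed comes from the motion-capture markers; dt is the control period (s).
    public float Update(float measuredWalkSpeed, float dt, out bool possibleFreeze)
    {
        beltSpeed = Mathf.Lerp(beltSpeed, measuredWalkSpeed, smoothing * dt);
        possibleFreeze = measuredWalkSpeed < freezeSpeedThreshold && beltSpeed > movingBeltThreshold;
        return beltSpeed;
    }
}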
Nowadays there is no effective treatment for freezing of gait; this study was made to understand which external stimuli cause it and whether “external environmental stimuli such as visual, auditory, and somatosensory can be used therapeutically to break the FOG cycle and improve gait-related mobility” [4].
The study was carried out on three patients who had shown the problem in the past, and the researchers were able to reproduce it in two of them thanks to the VR stimuli.
The study showed that training in the controlled environment, with VR and the equipped treadmill, was beneficial to the patients, improving their gait-related mobility. Moreover, it showed that the combination of virtual reality with a specifically designed treadmill can safely and realistically reproduce stimuli that the patients encounter in their real lives. This is very useful for the study of this type of disease, because researchers can now reproduce the problem in the laboratory and train patients to improve their ability to cope with those situations.
In the end, this type of technology opens up great opportunities in a wide range of situations, from virtual reality games and applications to military training and medical research. Thanks to these devices it will one day be possible to recreate completely immersive virtual worlds like the ones in the science-fiction movies that everyone wishes to visit.

Slides for presentation at: https://atrica2.people.uic.edu/doc/ChoicePresentation.pptx

References:
[1] https://en.wikipedia.org/wiki/Omnidirectional_treadmill
[2] https://en.wikipedia.org/wiki/Ready_Player_One_(film)
[3] https://www.roadtovr.com/infinadeck-2018-prototype-hands-on-most-natural-feeling-vr-treadmill-yet/
[4] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3701801/ (original link to the article; it sometimes doesn’t work, so the same article is also available at https://europepmc.org/articles/pmc3701801 and https://ieeexplore.ieee.org/document/5975463/authors#authors)
[5] https://www.vrfitnessinsider.com/vr-omnidirectional-treadmills-making-gains-towards-full-immersion-and-cardio/
[6] https://en.wikipedia.org/wiki/Virtuix_Omni
[7] https://en.wikipedia.org/wiki/Applications_of_VR

VR/AR platforms: pros & cons

(Image from https://www.pexels.com/photo/electronics-grass-lawn-modern-532559/)

In the first week of the Virtual & Augmented Reality course we tried some different VR & AR demos. Here, I want to briefly describe my experience and what I think about those different technologies.

First, the CAVE2: it is a large room where I tried some very immersive virtual-reality applications, including a virtual trip over Mars and two very interesting applications used in medicine to observe chemical reactions in a new, immersive way. The experience in the CAVE2 was the most immersive that I tried, because I was in a huge room without any heavy headset, wearing only a pair of comfortable, light 3D glasses, and this gave me a feeling of real immersion. The only con is that, probably because the experience is so immersive, some demos in the CAVE2 gave me a slight sensation of motion sickness.

The HTC VIVE head-mounted display: the VIVE headset is a fantastic device, the technology is mature, and the demos were very immersive. I’ve always been a fan of video games, and the game I played with the VIVE really surprised me: the video definition was excellent, the frame rate was high and the controllers were very precise. In my opinion this is the best way to play video games, even for someone like me who has always been used to playing with a controller while sitting in a chair. Being free to move in any direction, with my own body as the controller for the game, is amazing. However, the VIVE headset is not only for games, and I had proof of this with the Google Earth VR demo. This demo was, if possible, even better than the video game. The sensation of freedom was absolute: I could go anywhere on the planet, and when I visited New York in VR it seemed that I was really there, flying between the skyscrapers. I also have to mention a remarkable and useful feature of this headset: the “Chaperone” system, the technology that allows the VIVE to warn you about any large obstacle close to you, which is crucial to avoid unpleasant incidents during its use. Obviously, the HTC VIVE is not a perfect technology: it is a fairly heavy headset, and the problem is not that it is too heavy for the head, but rather that I sometimes had the feeling, while moving, that it could fall off. Thus, the only con I found in the VIVE headset is in its straps, not in the technology.

The Microsoft HoloLens: I was very excited when I saw the HoloLens; I had wanted to try it since I saw the first presentation of this technology some years ago. The HoloLens is a headset for augmented reality, and it is very different from the VIVE because Microsoft’s headset shows virtual objects in the real world, not in a totally virtual environment. A very important characteristic of augmented reality is that accidents are unlikely, because you can see the real world around you, and the Microsoft HoloLens is the augmented-reality headset that comes closest to giving you a sensation of real immersion in the augmented world. Nevertheless, I was a little disappointed by this headset. The HoloLens was quite heavy and uncomfortable, especially for people like me who wear glasses, and this ruined my experience with it.

The last demos I tried were common smartphone AR & VR applications. The VR demos were interesting; it’s surprising what is possible to do today with a simple smartphone and a cheap headset like the Google Cardboard or the Google Daydream. Even if the experience with the VIVE was much better (obviously), they still gave me a sensation of immersion in the virtual world. The demos that showed 360° videos on the smartphone were very immersive too; in particular I was fascinated by the video of an eclipse. The AR demos were even better: the smartphone is probably the most suitable device for AR applications (even better than the Microsoft HoloLens). The applications were a lot of fun, and I particularly enjoyed Spacecraft 3D.

Virtual Reality Week 2

The picture at the top of this page shows a simple example of augmented reality made with Unity3D and Vuforia.

It is a very simple application of this technology: a model of an astronaut standing on a printed image target. Still, it gives an idea of the potential of Augmented Reality applications.

For this experiment I used the small camera of the smartphone, and the virtual objects are visible only on the display of the computer.
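To give an idea of how marker-based AR apps usually handle this, here is a generic sketch of a script that keeps the virtual model hidden until the image target is being tracked. The method names are placeholders meant to be wired to whatever tracking-found/lost callbacks the SDK exposes (Vuforia provides equivalent events), so this is an illustration rather than Vuforia's actual API:

using UnityEngine;

// Generic marker-visibility sketch: keep the virtual content hidden until
// the AR SDK reports that the printed image target has been found.
// OnTargetFound/OnTargetLost are placeholders to be hooked up to the SDK's
// tracking callbacks.
public class ShowOnTargetFound : MonoBehaviour
{
    public GameObject virtualContent;    // e.g. the astronaut model

    void Start()
    {
        virtualContent.SetActive(false); // nothing is shown until the marker is detected
    }

    public void OnTargetFound() { virtualContent.SetActive(true); }
    public void OnTargetLost()  { virtualContent.SetActive(false); }
}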

The sensation of immersion in this mixed world is not so strong due to the use of an external camera connected to a laptop instead of a more suitable device, like a pair of AR glasses (like the Google Glass) or something like the Microsoft HoloLens.

Nowadays, smartphones are the easiest way to use an Augmented Reality application. Everyone has a smartphone with a decent camera and processor, and this is enough to run a simple application.

There are thousands of cases in which an application of this type on a smartphone can be helpful. Think about when you need to furnish your home and you want an idea of how a piece of furniture will look in your living room: with a suitable app, it is enough to take your smartphone, choose the piece of furniture and frame the room, and the result will be visible on the phone’s screen. Another good application of this technology is trying on clothes in a very easy way: it would be enough to stand in front of a big screen connected to a camera, and a piece of software could show your image on the screen with the selected clothes on you.

Anyway, with a pair of AR glasses you can have a totally different experience, because with them you can see the virtual objects projected directly in front of you in the real world, without using an external device that shows the virtual objects on a display. For this reason, they will help you to be completely immersed in the mixed (virtual & real) world.

In the future, probably, everyone will have a pair of AR glasses and people will live their lives immersed in this world where they won’t be able to recognize the difference between real objects and the virtual ones.

Many objects of common use could be replaced with virtual ones; just think about the television or all the ornamental objects we have in our homes: if everyone used AR glasses in their daily lives, there would be no need to have these kinds of real objects, because the virtual ones would be enough. Another good example is books, which could be replaced with pages projected directly in front of you.

I know that this scenario is more like a dream and that it can scare some people; however, one good aspect would be the ecological impact: more and more objects could be replaced with virtual ones!

In conclusion, I’m very enthusiastic about this technology and I cannot wait to see how it will grow in the future.

(Images taken by myself)

GoogleTranslate & AR

As we have already discussed in the previous pages of this site, augmented reality has a lot of incredible applications. Here I want to talk about one of them: a particular feature of Google Translate. The feature I’m talking about is the one that allows you to frame a text with your smartphone’s camera and see, in real time, the translated text overlaid on it: this is augmented reality applied to translation!


(All the images are taken by me using the Google Translate app).
As we can see in this example, the app can recognize hand-written text too.
The idea for this feature dates back to 2010, when it was the main functionality of an app called Word Lens. Afterwards, Google bought Quest Visual (the company behind the application) and integrated it into Google Translate. This led to more precise translations; I had tried Word Lens before the acquisition by Google, and its translations were not so good.

Now the translation ability of this technology is pretty good (Google is a leader in this area). Unfortunately, though, some problems of the first app are still present, in particular with recognizing text from the real world in real time and with the feature that shows the translated text over the original in augmented reality.

To get a decent result it is necessary to stay still and frame the text well, otherwise the results can be disappointing and puzzling. But, even in the worst case, the app gives you some clues and you can get an idea about the content of the text.


In this other example it is clear that this technology can fail. Here the translated text is puzzling and the words overlap. This could be very frustrating if the user were wearing a pair of AR lenses and couldn’t switch off the application! Therefore, in addition to not translating the text correctly, this application can also confuse the user in some situations.
The idea behind this application is incredible, and it is a perfect use of augmented reality! I imagine a future in which everyone has a pair of AR glasses with this technology built in, and a pair of headphones that translate every speech in real time (a similar feature is already present in Google Translate). This would be a wonderful technology to help people when they visit other countries. But its use is not limited to tourism: a well-implemented augmented-reality translator can be of great help for students too! To understand the impact it could have, it is enough to think of all the students who have to study a particular subject for which there is no book in their language. With a pair of glasses that translate using augmented reality, they would be able to study from books by any author without problems.

Even if this type of AR application will be very important in our future, thanks to the widespread use of AR glasses or even AR contact lenses, it is important that the user always stays in control of the technology and that it doesn’t work in a completely automated way. Taking the Google Translate technology as an example, it is crucial that the user is the one who switches on the AR translation, because sometimes they will probably prefer to see the real world as it is, and not with a translation overlaid on every text in front of them.

In conclusion, I think this application is a very important example of how augmented reality can really help people in their everyday lives, but for now it is “more like a demo” (even if a very good one, thanks to the power of modern smartphones) of what it will become in the future, when it will be used through a pair of AR glasses like the Google Glass!





AR SkyView

(Image from https://www.pexels.com/photo/blue-and-purple-cosmic-sky-956999/)
“Humanity's interest in the heavens has been universal and enduring. Humans are driven to explore the unknown, discover new worlds, push the boundaries of our scientific and technical limits, and then push further. The intangible desire to explore and challenge the boundaries of what we know and where we have been has provided benefits to our society for centuries. Human space exploration helps to address fundamental questions about our place in the Universe and the history of our solar system” (from www.nasa.gov/exploration/whyweexplore/why_we_explore_main.html).

All humans have always been attracted by the sky, ever since our ancestors took their first steps on the Earth. Humans used to look at the sky to find their way during their travels (using fixed stars like the North Star) or to find their gods.

The captain of the starship in the famous TV show Star Trek used to say: “Space: the final frontier. […] To explore strange new worlds. To seek out new life and new civilizations. To boldly go where no man has gone before!”. Since I was a child I have always been fascinated by these words; there is no one who feels nothing while looking at a starry sky at night. During summer I love to go to the countryside, where I can find places with only a little light pollution and where I can take photos like the following one. Looking at that wonderful scene makes me feel thankful for my life; all my problems lose their importance in front of the hugeness of the universe.
(Image taken by me)
The only problem I have when I look at the stars is that I’m not able to recognize them! I don’t know the constellations, how to find a particular star or planet, or where to look to see distant galaxies. And at this point a new application of augmented reality comes to our aid: the technology that allows you to point your smartphone at the sky and see the information (like name and exact position) about stars, planets, galaxies and constellations on the screen of your phone, directly overlaid on the sky framed by your camera (thanks to augmented reality). There are lots of applications of this type; in particular I used SkyView for Android smartphones (all the following images were taken by me using this app).

The use of augmented reality here is perfect: AR is not used only to create a more immersive experience but also to improve the effectiveness of the map. Unlike a street map, a map of the sky is very difficult to read, and orienting yourself among the stars can be hard. For these reasons this kind of application is very useful: thanks to the gyroscope and the GPS it can retrieve your exact position and the direction you are looking in, and then, thanks to the phone’s camera, it can show the map of the sky directly over the image captured in real time. In this way it is very easy to understand which celestial body you are looking at.
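To give an idea of the kind of computation behind this overlay, here is a toy sketch that simply picks the catalog object closest to the direction the camera is pointing. The catalog directions are assumed to be already computed; a real app derives them from the GPS position, the time and the orientation sensors, which is the hard part and is not shown here:

using UnityEngine;

// Toy version of "what am I pointing at?": given unit direction vectors
// toward a few celestial objects, pick the one closest to the camera's
// forward direction using the dot product (larger dot = smaller angle).
public static class SkyLookup
{
    // names[i] corresponds to directions[i], a unit vector pointing toward that object.
    public static string NearestObject(Vector3 cameraForward, string[] names, Vector3[] directions)
    {
        int best = 0;
        float bestDot = -2f;
        for (int i = 0; i < directions.Length; i++)
        {
            float d = Vector3.Dot(cameraForward.normalized, directions[i].normalized);
            if (d > bestDot) { bestDot = d; best = i; }
        }
        return names[best];
    }
}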

The only limit of this application is that the sky seen through a smartphone camera is definitely worse than the real one, so this is another application that could be improved with a pair of AR glasses: with those glasses you could see the sky directly with your own eyes and still have all the information of the app projected onto it. Using an application like the Google Assistant (in particular the Google Lens feature) with these glasses, you could ask the assistant “what is that?” while looking at a star, and it would reply with its name and a description taken from the internet, while at the same time showing you the constellation it belongs to.

This application is also very useful for learning new things about the sky: every time you tap the figure of a celestial body it shows you lots of information about it, together with a link to the Wikipedia page so you can explore the topic further. It is a very accurate map (which is its main function), and it has a night mode to make the view on the phone’s screen more comfortable at night. The view is well implemented too: there are lots of options to decide what to show on the screen, and this gives the user the ability to completely personalize the view. In my opinion, it could be improved by adding more information about what is shown on the map, taking it from a source like the official NASA site (and not only from Wikipedia). Taking information from the NASA site would let the user learn more accurate and scientific information about space, and it would also make it possible to show the latest news about space exploration. (Image taken by me using the SkyView app)
Here, for example, I took a screenshot of the app while looking at Saturn and at the constellation Scorpius: the planet is shown as it is and not as a point with a label (I think this is very educational), and the constellation is represented very well, showing the stars, the lines between them and the scorpion figure, to better convey the origin of its name.
(Image taken by me using the SkyView app)
This is the Moon, and the following images show some of the information that the application provides. (Image taken by me using the SkyView app) (Image taken by me using the SkyView app) (Image taken by me using the SkyView app)

Project 1: reviews

(Image from https://vuforia.com/blog/unity-partnership.html)
Over the past few days we presented our augmented reality projects. Here I’m reporting my impressions of two of them.
The first project I tried is the one by group 19 (Santambrogio, Bellini). Link: https://cs491.albertomariobellini.com/projects/1/
I had a very good impression of this project from the beginning. The aspect that most impressed me is the high quality of the homemade models’ design. These models are well designed and look very good during the AR experience of the project. It’s really difficult to say which are the most beautiful; here I’m attaching images of the two models that fascinated me the most: the USB pen drive and the “moka” model.
The goal of the project is to create some interesting AR scenes around breakfast objects like cereal boxes, cans, placemats and magazines.
The scenes in the project are very interesting, and the quality of the models makes them all more immersive. Moreover, the authors implemented some useful interactions between objects that combine the immersive and fun side of the mixed world created with AR technology with the healthy purpose of making people aware of what they are eating. The application shows this type of interaction when the user puts a cereal box or a can near one of the two placemats.
The interactions of the Super Mario placemat are a lot of fun. Initially, the scene has a model of Mario jumping in front of the user with three balls floating over the scene; I appreciated the combination of an Italian character like Mario with the floating balls, which at the start of the scene have the colors of the Italian flag. When something to eat or drink gets close enough, the balls change and show that object’s nutrition information.
The other placemat, the one with the television and the chef, is my favorite. The model of the television and the audio clips of a TV show make the scene realistic and immersive. Here the interaction between the placemat and the cereal boxes / cans is even more interesting: as you can see from the screenshots below, the television initially shows a screen with the title “breaking news”, but as soon as you put something to eat or drink near it, the nutrition information appears directly on the TV in place of the news.
There is a lot of work behind this project, and the proof is the great number of interactions it contains. They are quite simple, but each object behaves differently depending on whether it is next to a cereal box, a can or one of the homemade targets. The last thing I particularly enjoyed is their idea for the magazines. They made two magazines, with articles about the two main topics of this course: Augmented and Virtual Reality. The things I appreciated are the animation of the moving pages and the way they implemented the interface. The movement of the pages is very realistic, and it wasn’t easy to do; only a few projects have implemented it in a realistic way. The other thing is the way they managed the user interface: the two buttons to flip the pages. The user needs two buttons (one to flip the pages to the right and another to flip them to the left); in many projects these are placed over the magazine’s pages, but this is uncomfortable for the user because they partially cover the text (even if they are transparent). They solved this problem by using the target only as a platform, and not as a real page of the magazine, and by putting the two buttons on the platform (which is dark to create more contrast with the white pages of the book, making them more readable) with the virtual magazine immediately above them.


The other project that I enjoyed is the one made by group 29 (by Hughes). Link: https://eddevs.com/cs491/p1/
This project is made by only one person, but I think it is the most beautiful among all the projects. I also appreciated the apk file on the project’s website, which gives you the opportunity to try the application directly without using Unity3D.
The requirements for this project are lighter than for the previous one, because the group consists of only one person and the developer is an undergraduate. Anyway, he has done a great job with good design decisions.
The box and can scenes are very particular: the objects are not at the center of their scenes, but the scenes are built directly on top of them. This is a great idea, because on a breakfast table there is not much room for a big scene around every object; the way he builds these scenes doesn’t need much space around them, which solves the problem.
The two cereal boxes have nice scenes. A feature I appreciated is the “goblin crunch” text: the whole scene is built over the box and moves with it, and the text always stays above it, but it doesn’t rotate with the rest of the scene, so it always remains in the right orientation in front of the user. The scene of the can is fun: the animation of the bubbles together with the music is relaxing, and the text that moves around the can makes the scene less static. The last cereal box has the most beautiful scene in my opinion: the characters moving on the drawer are funny, the music is perfect, the goblin (created by the developer himself) is very good, and the mix of fluorescent colors with the dark theme is a wonderful contrast.
The placemat is interesting: it has a very simple scene on it, but it changes in interesting ways when it is near other objects. If it is close to a can, it simply shows the nutritional information over the can, but when it is close to one of the two boxes (or both of them), besides showing nutritional information, the scene over it changes. The placemat has a light scene that contrasts with the scenes over the two cereal boxes, so when one of the boxes is near it the scene over the placemat changes to a night scene in keeping with the one over the box.
The magazine is particular: unlike the other project I reviewed, it has the buttons over the pages, but here this is not a problem, because the developer put them in positions where they don’t cover anything; moreover, they play a nice animation when clicked, as if they were actually pressed physically. The magazine is not a representation of a real book: every page has its own particular scene, some of them are very interesting (one in particular has a video on it!), and there are animated objects in every scene.
The last thing I have to mention about this beautiful project is the frame rate: it is always very high, and there are no problems while the application is running (like animations or audio clips that don’t stop when they should).
(All the images on this page were taken by me running the applications in Unity3D; the links and the information about the developers are at the beginning of each review.)

Ikea Place

This week I tried some AR applications on my phone that allow the user to preview pieces of furniture in their home or office using augmented reality.
For years there have been applications that make it possible to design our homes in a virtual world; many people find this a lot of fun and play games like The Sims just to create their ideal house. Anyway, with these kinds of applications the user can’t see exactly how the final result will look in the real world. So they are useful for getting an idea of the general design of the house, but they are not good tools for choosing pieces of furniture to buy to decorate our home.
With the increase in the computing power of phones we are witnessing nowadays, we have more sophisticated tools for trying out pieces of furniture in our home before buying them. These new applications use AR platforms and the smartphone camera to give a very good preview of how a room will look with the new pieces of furniture we wish to buy. This is achieved with AR APIs (depending on the operating system of the phone) that allow developers to easily place virtual models of the objects they want to sell into the scene captured by the camera.
I tried IKEA Place. This app uses ARCore, the SDK developed by Google to build AR applications for Android devices, which has three fantastic features that are very useful for building AR content: “Motion tracking (it allows the phone to understand and track its position relative to the world); Environmental understanding (it allows the phone to detect the size and location of flat horizontal surfaces like the ground or a coffee table); Light estimation (it allows the phone to estimate the environment's current lighting conditions)” (adapted from https://en.wikipedia.org/wiki/ARCore).
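IKEA Place itself is a native app, but to give an idea of how this kind of plane-based placement works, here is a minimal "tap to place" sketch using Unity's AR Foundation wrapper around ARCore/ARKit; the furniture prefab and the component wiring are my own assumptions for the example:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal "tap to place furniture" sketch on top of AR Foundation
// (which wraps ARCore on Android and ARKit on iOS).
public class TapToPlaceFurniture : MonoBehaviour
{
    public GameObject furniturePrefab;          // the 3D model of the piece of furniture
    public ARRaycastManager raycastManager;     // provided by the AR session rig

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // Raycast against the horizontal planes that ARCore has detected.
        if (raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;        // where the tap intersects the detected surface
            Instantiate(furniturePrefab, hitPose.position, hitPose.rotation);
        }
    }
}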
This app is incredibly useful: thanks to AR technology, it lets you really understand how the furniture you want to buy is going to change the design of your house, with no time wasted, because you can do it all by yourself with your phone at home. At the end of this page you can find two images taken while using this application, the first one in a living room and the second one in a street (in this last case, the app had some problems understanding the real measurements in an open space like a street).
This AR technology is extremely powerful, and it opens many opportunities for the future. In a few years we could have many new educational applications that use this technology to make the learning process more interesting and, above all, more effective. I can imagine astrophysics textbooks in schools that, with some QR codes on the pages, trigger an application showing space and the planets of the solar system around the classroom. This can already be done with smartphones, but with the future spread of this technology we will see many new AR devices more suitable for this task (like AR glasses). Another good educational application that we will surely see in the future is the use of platforms like ARCore (or Apple’s version, ARKit) to create more immersive experiences in museums. This would be amazing: just think of the film “Night at the Museum”. We could have a similar experience by creating virtual characters that explain and show things to visitors using AR devices, or by showing extinct animals as if they were actually standing in front of the people in the museum (as I pointed out before, I hope AR glasses become widespread to make these experiences more immersive).
The last example of an application I want to talk about is the best one: an application that shows the clothes you want to buy online directly on you, in a mixed-reality world, as if you were really wearing them. By understanding real-world measurements well enough, the application could show the virtual clothes on you so precisely that you would be able to figure out the right size. Clothes are the only thing that is still difficult to buy online without trying it on in person, but the future will be very different. And this is not only my hope: there are many projects around this kind of AR application; for example, Amazon has patented a mirror that dresses you in virtual clothes (here you can find an article about that: https://www.theverge.com/circuitbreaker/2018/1/3/16844300/amazon-patent-mirror-virtual-clothes-fashion). (Images taken by me with the IKEA Place app)

Different Scales in a Virtual World

(Images taken from https://stevensalvatore.com/2014/06/24/escapism-in-childrens-literature/)
This week, during the design of our virtual reality project, I tried changing the scale of the rooms and of the objects in them and then playing the game.
I scaled the objects in two ways: once by selecting all the objects (apart from the camera) and scaling them all at the same time, and once by increasing the scale of the objects one by one. In the first case I had no big technical problems, but at this scale it was impossible to play the game; I could only view some rooms from a different perspective. On the other hand, scaling the objects one by one gave me various problems (as you can see in the last screenshot at the end of this page): this operation changes all the relative positions, and nothing was aligned anymore.
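One possible workaround for the misalignment (my own suggestion, not something I actually did in the project) is to scale everything around a common pivot: parent the objects under an empty root and scale the root, so the relative positions scale together instead of each object growing around its own pivot.

using UnityEngine;

// Scale a whole set of objects uniformly around a shared pivot by parenting
// them under an empty root and scaling the root: relative positions and
// sizes scale together, so the layout stays aligned.
public static class SceneScaler
{
    public static Transform ScaleTogether(Transform[] objects, Vector3 pivot, float factor)
    {
        Transform root = new GameObject("ScaleRoot").transform;
        root.position = pivot;                       // everything scales around this point

        foreach (Transform t in objects)
            t.SetParent(root, worldPositionStays: true);

        root.localScale = Vector3.one * factor;      // one uniform scale for the whole group
        return root;
    }
}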
Anyway, a view of a scene at such a different scale is very interesting. It’s very enjoyable to move through the space like a tiny ant, and it gives you a completely different view of a scene, like in Marvel’s movie “Ant-Man”. This experience also reminds me of one of the first scenes of the famous movie “Alice in Wonderland”, the one in which Alice is in a room and changes size several times by eating a cake and drinking a potion. The scaling tools of Unity would be very useful, for example, when creating a game based on those two films.
The experience of moving in a familiar space but at such a hugely different scale is weird: movements in such a big space feel insignificant, and you cannot navigate the space in a decent way. Sometimes it becomes impossible to reach some areas of the scene; for example, in one of the rooms of my project it is impossible to reach the sink from the floor (it is necessary to put the player next to the sink before starting the game).
Another problem that is impossible to overcome is the manipulation of objects. How can someone grab and use a torch that is ten times the size of the player, even in a virtual world? How can a player open an elevator door if the buttons are too high up? It’s like being a child who wants to reach the cookies in a cabinet that is too high for them.
The capability of scaling models in the virtual world can be useful in a multitude of different situations.
The most practical one is when you are designing a 3D scene (e.g. for a game, or for a scene that has to be animated to create a video clip) using a development environment different from the one used to create the models, or when you are importing into your scene models created by someone else and the scales of the objects don’t match. Without the scaling tool you couldn’t use those models in your scene and you would have to recreate the 3D models from scratch with the correct dimensions.
Moreover, there are situations in which we use computers and modelling tools to create virtual worlds that represent existing ones but at a completely different scale. We saw an example of this during the demos in the first lesson of the course: a virtual reality application played in the CAVE2 in which it was possible to observe a chemical reaction between two molecules as if we were as small as an atom, and another used to visualize the nervous system of our brain. This type of VR application, which visualizes what happens in our body at this scale, can be very useful for better understanding the processes involved.
There is also another application that uses the same principle of scaling real objects to give you a view of the real world that would be impossible to have without computers: Google Earth. Google Earth and all the applications that model planets in space are essential for getting a general view of something so big, from a point of view that is impossible for us to reach. At this link http://stars.chromeexperiments.com/ you can find a web application of this kind that uses different scales to give the user a more concrete idea of the size of our galaxy.
(Images taken by me with Unity3D running my second project)

Microsoft Holoportation

(Image from https://www.microsoft.com/en-us/research/project/holoportation-3/)
Holoportation is one of the most interesting mixed-reality projects I have ever seen. It “is a new type of 3D capture technology that allows high-quality 3D models of people to be reconstructed, compressed and transmitted anywhere in the world in real time” [from https://www.microsoft.com/en-us/research/project/holoportation-3/]. This technology is developed by Microsoft, which uses it in combination with the Microsoft HoloLens. The Microsoft HoloLens is a pair of smart glasses for mixed reality designed by Microsoft, and it is the most successful device of its kind. It is perfectly suited to Holoportation.
This technology opens many new possibilities. At its core, Holoportation is the 3D capture technology: thanks to it, it is possible to record plenty of scenes in 3D and save them to have perfect memories of certain events. Just think about historical events such as important meetings between politicians, or simple things like a child’s birthday party that the parents want to record in a way that is better than a simple video.
But this technology expresses all its potential when it is combined with the Microsoft HoloLens. By combining these two things, Microsoft has created a way for people in distant places to communicate using an internet connection and holograms. This is made possible by Holoportation, which captures a 3D view of each person; each view is then transmitted (in real time, over a very good internet connection) to the other person, who can see it through the mixed-reality display of their Microsoft HoloLens.
This type of communication is something people have always wished for; just think of all the science-fiction movies where this is the kind of communication that is used. Thanks to holograms, it is possible to have really interesting and interactive communication with someone who lives far away from us. This technology can also be very useful for large companies that have to arrange business meetings with companies in other countries: using it, they would save a lot of the money currently spent on travel for the employees who have to attend those meetings, because they could attend remotely using holograms.
This technology seems very optimized, and judging from the videos Microsoft has published on its site, it seems to work very well. The device itself, the Microsoft HoloLens, is very powerful and is able to create virtual objects and map them onto the real world very efficiently, ensuring an excellent result (we saw this during the first week of the class).
I can find only two problems with this technology. The device is quite heavy and not very comfortable, and capturing the 3D models requires setting up lots of cameras around the scene (so it is not very easy to use). Moreover, these devices (the Microsoft HoloLens and the cameras) are still quite expensive.
Now the engineers are focused on reducing the need for a very good connection to establish good real-time communication, and as a case study they are using this technology in a car moving through the streets. This scenario requires reducing not only the bandwidth needed but also the space required by the capture setup.
You can find all the videos regarding these technologies on: https://www.microsoft.com/en-us/research/project/holoportation-3/.

Comments on Student's Choices

Comment about "Teslasuit – Full-body VR Haptic Suit"
(from https://mrasto3.people.uic.edu/cs491/assignments/students-choice) (Image from https://mrasto3.people.uic.edu/cs491/assignments/students-choice)

Teslasuit is a new type of VR device that has the potential to revolutionize the VR market.
Nowadays the most common devices used to capture the player’s movements are cameras that work together with the headset and with a pair of controllers the user holds in their hands.
This kind of tracking system is not very accurate at tracking all the user’s movements, in particular in applications where the user has a full-body avatar. In this case the common VR devices are not able to track all the body’s movements correctly, and for this reason the movements of the avatar in the virtual world are only approximations of the user’s movements.
Moreover, an important feature that is missing in the common setup of a VR station is the possibility of having reactions in the real world that come from what happens in the virtual application. In other words, the common devices don’t give the VR application the possibility to send feedback to the user in the real world.
These two deficiencies of the common devices used for virtual reality are instead two of the main features of the new Teslasuit. This device is not yet ready for commercial use, but it is already available to enterprises and game development studios.
This suit integrates lots of interesting features in a single device: motion capture, haptic feedback, climate control and a biometric system.
The motion capture system of the suit is composed of eleven motion-capture modules that are used to provide an avatar system (the suit is able to track all the movements of the user’s body) and a complete gesture-control system, thanks to the on-board chip that processes the data received from the different modules.
The haptic feedback system makes it possible for the VR application to send feedback to the user through the two systems provided by the suit: Transcutaneous Electrical Nerve Stimulation (TENS) and Electrical Muscle Stimulation (EMS), technologies already in common use in the medical field.
The climate control system is based on thermoelectric heat-exchange technology and can create a more comfortable environment for the user while they are immersed in the virtual world. It can also be used as an active channel controlled directly by the VR application, to give yet another type of feedback to the user or, for example, to support medical VR applications used to study and help people through VR therapies.
The last key feature of the suit is that it is able to collect lots of different biometric data from the user, including both physiological and behavioral data. These data are very useful in the medical field or in VR applications used for physical training.
In conclusion, this suit seems to be a revolutionary device, with lots of new and old technologies put together to create an all-in-one VR device that has lots of useful applications (from VR games to VR medical therapies). Now we just have to wait for the commercial version of the suit to find out the price (which will probably be the most important factor influencing the spread of this device) and how these features work in real situations during everyday use by players.


Comment about "Expedia Cenote VR"
(from https://mmanto2.people.uic.edu/projects/ExpediaCenoteVR.html) (Image from https://store.steampowered.com/app/858380/Expedia_Cenote_VR/)

The primary goal of a virtual reality technology or application is to give people a good and immersive experience in a virtual world. Many times these virtual worlds are designed to be very different from the real one, because VR gives us the opportunity to escape from the real world into an imaginary one where everything is possible, and to choose the world that is perfect for us.
Anyway, this is not the only possible case. There exist many applications of VR technology where the virtual world represents the real one or, even better, is a recorded version of actual places around our planet.
We saw some examples of this kind of application during classes, like VR applications designed to let you visit a museum or explore some ruins in Egypt. Probably the most important VR application of this kind is the one made by Google, which gives users the opportunity to explore the world through Google Earth in a much more immersive way thanks to a VR headset.
Expedia is one of the most famous travel technology companies in the world; it has lots of websites where people can book flights, hotels, cars and anything else needed for a complete trip. It uses modern technologies to help people arrange everything necessary for their travels.
Now it has released a new service to help people decide which places they want to visit during their trips: Expedia Cenote VR.
Expedia Cenote VR is a virtual reality application distributed on the Steam market; it provides a highly immersive cultural experience in which the user can see the cenotes found in the Yucatán Peninsula.
The experience is very immersive: the player can actually swim through the underground caves, and it is not only a visual experience, because the user can also learn facts about those places and their history.
Moreover, this application is free, because the company uses it to advertise its services and persuade people to book travel through its site. But everyone can download and play it: it is not only an advertising medium, it is a real VR application that people can use to visit a place that is probably new to them and that maybe they will never visit in person.
I think applications of this kind are the best use of virtual reality technology. The desire to explore the world and visit new places is common to everyone, and virtual reality can be used to offer the possibility of seeing new places in a way that is much better than through ordinary videos and pictures. Moreover, for some people this may be the only way to visit certain places, because they have no other way to go there in the real world (for example, because of the cost of the trip or because of medical problems).
(Image from https://mmanto2.people.uic.edu/projects/ExpediaCenoteVR.html)
