The parameter estimation for the prototype scanner is going on vigorously; there are many constraints in it that need to be considered. Hardware with typical behavior must be used, and its operation must be stretched to its maximum.
My project mentor is a knowledgeable person whose expertise and experience are vast. I'm very lucky to be guided by him.
We are currently discussing the problems associated with the scanner, and soon we'll come up with a solution.
About my work - my project on virtual reality and its implementation
Saturday, July 23, 2011
Friday, July 22, 2011
Getting my first experimental data
Successfully started off with my data collection for the prototype design. My project mentor will be providing me with some data today, using which I'm going to start the parameter design and estimate the limitations.
Once I get the data, the next step is to calculate the working parameters of the prototype's scanner.
Thursday, July 21, 2011
Kumar Lanka's comment - for all to read
Kumar Lanka said:
Very impressive. I'll keep coming back to this blog from time to time.
Once the project is live, is it something like, users will have some streaming enabled on their mobile phone and transmit that to the VRD via Bluetooth?
My reply to his comment is:
It is a no and a yes!
Because once the project comes alive, yes, all available electronic gadgets can be integrated into it. But there is one more interesting point to note: a mobile phone has got its shape mainly because of its screen, so that users can read the content conveniently. But when I integrate a mobile phone with a VRD, I don't need a screen. So the processor and transmitter can be kept somewhere in your backpack and transmit the data to the VRD (either via Bluetooth or any other technology). The same thing applies to all other gadgets. In fact, a single processing unit can be used for all the gadgets, since they all share the common screen, i.e. the VRD.
Wednesday, July 20, 2011
Gave a presentation
My presentation about my project to the professors at my university went well. They seemed to like my work, and they are providing me with workspace and guidance. I'm very grateful to them for encouraging and inspiring me. Thank you very much.
Now that a base has been laid, I need to utilize this opportunity well and work hard on the project.
I'm now calculating the ratings of the stepper motor and the DC motor that I require for my prototype. Tomorrow evening, I'm planning to buy them.
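As a rough aid for sizing the motors, a quick back-of-the-envelope script like the one below estimates the step rate the slow-axis stepper needs and the speed the fast-axis DC-motor mirror would have to spin at. Every target figure in it (lines per frame, frame rate, steps per revolution) is a placeholder assumption, not a final design value.

# Rough sizing estimate for the prototype's scan motors.
# All target figures below are assumed placeholders, not final design values.

LINES_PER_FRAME = 32       # assumed vertical resolution of the demo scan
FRAME_RATE_HZ = 2          # assumed frames per second for the demo
STEPS_PER_REV = 200        # a typical 1.8-degree stepper (assumption)
STEPS_PER_LINE = 1         # stepper advances one step per scan line (assumption)

# Slow axis (stepper): one step advance per line drawn.
line_rate_hz = LINES_PER_FRAME * FRAME_RATE_HZ        # lines drawn per second
step_rate_hz = line_rate_hz * STEPS_PER_LINE          # steps per second required
stepper_rpm = step_rate_hz / STEPS_PER_REV * 60.0     # equivalent shaft speed

# Fast axis (DC-motor mirror): assuming one usable sweep per revolution.
dc_motor_rpm = line_rate_hz * 60.0

print(f"Line rate:         {line_rate_hz:.0f} lines/s")
print(f"Stepper step rate: {step_rate_hz:.0f} steps/s (~{stepper_rpm:.0f} rpm)")
print(f"DC motor speed:    {dc_motor_rpm:.0f} rpm for one sweep per line")

These numbers only bound the speed side of the rating; torque, driver current and mirror inertia still have to be checked against the datasheets.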
Various labs in virtual reality research
I have recently found that the HIT Lab at the University of Washington, Seattle, is not the only university research lab dedicated to virtual reality research. There are many other research labs:
1. Stanford University
2. UC Berkeley
3. University at Buffalo
4. University of Houston
5. Massachusetts Institute of Technology
6. Macquarie University
7. Norwegian University of Science and Technology
8. The Max Planck Institute for Biological Cybernetics
So, virtual reality is not something just out of fiction books; it has been under serious research at various prestigious research institutions.
Prepared an execution plan
Due to the unavailability of an internet connection, I couldn't blog for three days.
My progress has been good. I have created an execution plan that clearly defines all the work that needs to be done in my project, and I have organized it in a structured manner.
I have four stages in my project:
1. Making the prototype
2. Making the VRD device
3. Programming the VRD device
4. Further improvements
Each stage will consist of phases, and each phase will consist of modules.
After organizing my work in the above-mentioned structure, I have prepared a brief explanation of the plan.
Now, I have a presentation scheduled tomorrow with a professor.
Friday, July 15, 2011
A little progress with my prototype
Just now, I finished the hardware design of the prototype of the VRD, without including the data-input hardware for the microcontroller. That involves a bit of image processing, so after deciding on the algorithm for processing the input image, I must design the input hardware. In the meantime, I'm planning to construct the hardware that I have designed today...
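One possible shape for that input-processing step (just a sketch of my own with assumed numbers, not a decided algorithm) is to downsample the source image to the scanner's resolution and stream it out in raster order as quantized intensity samples for the laser driver:

# Sketch: turn a 2-D grayscale image into a raster-ordered stream of
# laser intensity samples. The image here is a synthetic test pattern;
# the resolution and number of intensity levels are placeholder assumptions.

SCAN_COLS, SCAN_ROWS = 32, 24   # assumed prototype scan resolution
LEVELS = 8                      # assumed number of laser intensity levels

def test_pattern(cols, rows):
    """A simple horizontal gradient with values from 0.0 to 1.0."""
    return [[c / (cols - 1) for c in range(cols)] for _ in range(rows)]

def to_raster_stream(image, levels):
    """Yield one quantized intensity sample per pixel, row by row,
    left to right - the order in which the scanning mirrors trace the image."""
    for row in image:
        for value in row:
            yield min(int(value * levels), levels - 1)

image = test_pattern(SCAN_COLS, SCAN_ROWS)
stream = list(to_raster_stream(image, LEVELS))
print(f"{len(stream)} samples per frame; first row: {stream[:SCAN_COLS]}")

The real input hardware would of course be fed from actual image data rather than a test pattern, but the raster ordering and quantization would stay the same.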
Thursday, July 14, 2011
Reply to Aaron - For all to read
Aaron had asked this question a while back in the comments and I had answered him, but I'm putting it up as a post so that everyone can read it.
Q: Hey man, it sounds really innovative.
Will the user have to sit still during the projection onto the retina? Since the pupil is only 3 mm at the most, how will it work? And the retina is very susceptible to damage, as it is a transparent neural layer. Are you anticipating any retinal complications in the long- or short-term use of your technology?
My reply to him was this...
A: Many research studies have been done on this issue, and it has been found that the eye need not be still. If we increase the exit pupil of the focusing lens, the image can be viewed by a moving eyeball. In fact, the human eyeball, as you might know, can never stay still for long. And I'm planning on implementing eye-tracking technology (which has been developed by a German research lab).
Moreover, there won't be any damage to the retina or the eye, since the light from the source that is incident on the retina is moved in a raster scan. Because of that, the average energy incident on any one point of the retina is very low compared to the energy emitted by the light source, which itself is only in milliwatts.
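To put rough numbers on that (every figure below is assumed purely for illustration, not a measured value), the dwell time of the beam on any one spot of the retina per frame is tiny, so the time-averaged power on that spot is a minute fraction of the already-milliwatt source power:

# Back-of-the-envelope exposure estimate for a raster-scanned spot.
# Every number here is an assumed illustration value, not a measurement.

SOURCE_POWER_W = 1e-3        # assumed 1 mW laser source
COLS, ROWS = 640, 480        # assumed scanned resolution
FRAME_RATE_HZ = 60           # assumed refresh rate

pixels_per_frame = COLS * ROWS
dwell_time_s = 1.0 / (FRAME_RATE_HZ * pixels_per_frame)   # time spent on one spot per frame
energy_per_visit_j = SOURCE_POWER_W * dwell_time_s        # energy delivered to that spot per frame
avg_power_per_spot_w = SOURCE_POWER_W / pixels_per_frame  # time-averaged power on one spot

print(f"Dwell time per spot per frame: {dwell_time_s * 1e9:.1f} ns")
print(f"Energy per spot per frame:     {energy_per_visit_j * 1e12:.1f} pJ")
print(f"Average power per spot:        {avg_power_per_spot_w * 1e9:.1f} nW")

Actual eye-safety margins would still have to be checked against the relevant laser safety standards, but this shows why spreading the beam in a raster brings the per-spot exposure down so far.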
But yes, there are other complications pertaining to the eye that will have to be seriously considered. For example, in a partially immersive system, if a high-clarity image is conjured in one part of your field of vision while the other part sees the ambient surroundings, the eye tends to concentrate only on the image with higher clarity and blurs out the other.
The physical interpretation of the above is that if you were to wear a VRD device, there is a risk that you would tend to ignore your surroundings, which is dangerous in a dynamic environment (e.g., you walk down the road watching the 'Transformers' movie on your VRD device but ignore the truck speeding right towards you, which could be dangerous!).
But such problems can be solved, and have been solved, because Microvision Corp. has manufactured augmented reality eyewear for the employees of a popular automobile company, with which the engineers were able to refer to the CAD drawings of an engine while simultaneously working on it.
So, constructing the device is nothing innovative on my part; it has already been done successfully and developed into a product. My work comes in designing a complete system (say, a program or algorithm) that generates fully immersive, partially immersive and augmented vision in a single device.
Saturday, July 9, 2011
Friday, July 8, 2011
Starting the prototype design
I'm starting with the design of a prototype for concept illustration.
Still no success with the stepper motor; none of my friends seem to have one. This needs to be looked into. Maybe I've got to buy one. Then I also need a microcontroller to program the stepper motor to run in a raster-scan fashion...
To illustrate my concept, I'm planning to construct a prototype. For this purpose, I'll be using a GaAs laser (a laser pointer that is available in the market) as the light source, a white plastic card as the screen, and two mirrors, one attached to a normal DC motor and the other to a stepper motor, which together act as the raster scanner.
For this, I need to connect the normal motor and the stepper motor to a microcontroller and direct the light in a raster-scan fashion...
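Before writing the actual microcontroller code, a small simulation like the one below (my own sketch, with made-up timing figures) helps check the control sequence I have in mind: the DC-motor mirror sweeps each horizontal line continuously, the stepper mirror advances one step at the end of each line, and the stepper retraces at the end of the frame.

# Simulation of the intended raster-scan control sequence. The line count
# and timing below are assumed placeholders, not measured prototype values.

LINES_PER_FRAME = 8        # kept small so the printout stays readable
LINE_PERIOD_MS = 20.0      # assumed time for one horizontal sweep

def raster_frame(lines, line_period_ms):
    """Yield (time_ms, line, event) entries for one scanned frame."""
    t = 0.0
    for line in range(lines):
        yield (t, line, "start horizontal sweep")   # DC-motor mirror sweeps this line
        t += line_period_ms
        yield (t, line, "step slow axis by one")    # stepper advances to the next line
    yield (t, 0, "retrace slow axis to the top")    # fly back for the next frame

for time_ms, line, event in raster_frame(LINES_PER_FRAME, LINE_PERIOD_MS):
    print(f"t = {time_ms:5.1f} ms  line {line}: {event}")

The real firmware on the microcontroller would replace the print statements with the actual motor-drive signals, but the ordering of the events would stay the same.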
Thursday, July 7, 2011
What's my project basically about?
I'm working on VRD (virtual retinal display) systems. In these, an image is raster-scanned onto the retina of the eye. It's as simple as that. In a TV set, an electron beam is directed along both the horizontal and vertical axes with the help of electric-field scanners in both directions, and the directed beam then strikes the screen. The virtual retinal display works on a similar concept: just replace the electron beam with a miniature low-power, low-intensity laser, replace the electric-field scanners with a MEMS (micro-electro-mechanical systems) scanner, and replace the screen with the retina of the human eye...
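To get a feel for the numbers involved, here is a rough calculation (the resolution and refresh rate are assumed example figures, not the specs of any particular device) of how fast the scanner and the laser modulation have to run:

# Rough timing requirements for raster-scanning a full image onto the retina.
# The resolution and refresh rate are assumed example figures.

COLS, ROWS = 640, 480     # assumed image resolution
FRAME_RATE_HZ = 60        # assumed refresh rate

line_rate_hz = ROWS * FRAME_RATE_HZ     # horizontal sweeps per second
pixel_rate_hz = COLS * line_rate_hz     # how fast the laser must be modulated

print(f"Line (horizontal scan) rate: {line_rate_hz / 1e3:.1f} kHz")
print(f"Pixel (modulation) rate:     {pixel_rate_hz / 1e6:.1f} MHz")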
I hope you now understand the complexity of the project...
Now, it so happens that this technology had already been invented and much researched at the Human Interface Technology (HIT) Lab at the University of Washington, Seattle, and manufactured as a marketable unit by Microvision Corp.
My aim is to develop a workable, feasible, implementable workspace environment that has three modes of operation, viz. fully immersive, partially immersive and augmented reality, and then to extend the technology to allow user interaction with the image.
This all translates to having a virtual screen suspended in thin air in front of the user, visible only to the user, into which a computer, a phone, an MP3 player and all sorts of electronic gadgets can be integrated as one piece of hardware.
The future of my project could be the integration of cloud computing, so that display devices are eliminated and the required processing power can be reduced manifold.
Finally, I may be very bad at explaining, but still... if you have any doubts, ask!
I read about bionic vision yesterday. It is much closer to my aim, i.e. to have fully immersive, partially immersive and augmented-reality vision. Basically, bionic vision is achieved by building the electronic circuitry and placing miniature LEDs on the surface of a contact lens. It is a highly miniaturized device that is made using nano-fabrication techniques like self-assembly.
It is a sophisticated device, all right, but first I will proceed with my VRD device. I know, though, that someday, in the course of my project, I'll have to work with bionic vision.
This new concept is very exciting!
Wednesday, July 6, 2011