Archive for the ‘Gaming’ category

Virtual Reaction Game — developing brave new worlds

July 2, 2015

You are suddenly riding around Stanford in an Aston Martin convertible.  3D text appears in front of your car with game instructions and a countdown timer.  Ringing bombs spawn all around you; you must locate and defuse them!  If you run out of time, you can restart the level… or keep playing amongst colorful exploding bombs that will certainly induce nausea.

That is the premise and experience of Virtual Reaction, an immersive Virtual Reality (VR) game.  My team created this interactive training game to improve a user’s sensory processing and reaction times.  We hypothesize that Virtual Reaction maximizes users’ sensory stimulation and engagement, which results in better training and more fun.


Explaining Virtual Reaction to my friend at the Synapse Arcade. He was the first of only two people to beat the game’s hard level!

New Skills:

  • Unity 5 game engine with JavaScript/C# scripts
  • Developed VR environment with Oculus Rift DK2 featuring live-action footage

Next Questions:  During the game’s development, I encountered four salient problems that I think epitomize VR’s biggest challenges:

  1. Lagging frame rate:  There was noticeable lag when you changed your view; our game needed to be far more efficient with its computation.
  2. Unconvincing sound:  Multi-sensory integration is essential to make virtual surroundings believable.  Our game featured background music, driving/car/wind sounds, and ringing bombs positioned in 3D space, but when mixed together the sound wasn’t convincing.
  3. Unnatural interaction:  Without Oculus Touch, input was constrained to keyboard keys.  Typing does not translate into VR well, so users may feel disoriented trying to interact with the game.
  4. VR sickness:  The three previous issues and several other factors caused discomfort for users.  Being in VR is fun, but during development I required frequent breaks to rest my eyes and keep my dinner down.

Key Takeaway:  VR will disrupt, enhance, and redefine how we consume media.  Applications are everywhere: gaming, cinema, training, research experiments, surgery, defense, communication, and a lot more.  Right now, it’s a young evolving platform that’s experiencing significant growing pains.  But I believe it’s only a matter of time before people overcome these challenges and unlock VR’s potential.


Our poster explains the basis of the VR game as a neuroplastic training environment.

VisualTouch Music Game — blending HW & SW

July 1, 2015

Embedded systems are everywhere.  A digital watch, microwave oven, electronic toy, and your car dashboard are a few disparate examples.  The characteristics of an embedded system are:

  • Made from a combination of hardware and software
  • Task specific.  They’re designed to perform one specialized task, unlike general-purpose computers
  • Usually reactive or feedback-oriented.  E.g., most home or kitchen appliances only function when you interact with them
  • Built for efficiency and frugality.  Your toaster needs a timer and heat control, but it really doesn’t need a full-fledged CPU
  • Reliable and stable.  Because they are simple, we expect these machines to just work; they shouldn’t require maintenance, since it’s often easier and cheaper to replace them
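The task-specific, reactive pattern described above can be sketched as a minimal control loop.  This is a hypothetical toaster firmware sketch, not real hardware code — the state fields and duration mapping are invented for illustration:

```c
#include <stdbool.h>

/* Hypothetical toaster firmware: one specialized task, purely
 * reactive, no general-purpose CPU needed. */
typedef struct {
    bool lever_down;   /* user has pressed the lever        */
    int  heat_level;   /* dial setting, e.g. 1..5           */
    int  ticks_left;   /* countdown, in seconds             */
    bool heater_on;
} toaster_t;

/* React to user input: the lever starts a timed heating cycle. */
void toaster_start(toaster_t *t, int heat_level) {
    t->lever_down = true;
    t->heat_level = heat_level;
    t->ticks_left = heat_level * 30;  /* crude duration mapping */
    t->heater_on  = true;
}

/* Called once per second, e.g. from a hardware timer interrupt. */
void toaster_tick(toaster_t *t) {
    if (!t->heater_on)
        return;
    if (--t->ticks_left <= 0) {
        t->heater_on  = false;  /* pop the toast */
        t->lever_down = false;
    }
}
```

The whole device is a timer, a dial, and two state transitions — exactly the kind of job where a full operating system would be wasted.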

For my required EE hardware design project, my team developed an embedded system on an Atlys Spartan-6 development board.  The board’s FPGA implements your hardware specification, and a soft processor built into that fabric can then run C software on top of the hardware.


My brother’s Flat Stanley visited me while I was working in the EE lab.  Sadly, I couldn’t find a photo of our finished project, but this is our Spartan-6 Development Board

We hooked up a webcam, a screen, and speakers to our board and had the screen display the camera’s output.  Our game consisted of these playable modes:

  1. Changes the chord’s duration based on the overall red/green/blue content of the camera frame.
  2. Changes the chord based on the amount of movement detected by the camera.
  3. A grid is overlaid on the screen.  When a user in front of the camera covers a square of the grid with their hand, the board plays a song.
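As a rough sketch of how modes 1 and 3 might be computed in the C layer — the frame size, grid size, thresholds, and pixel format here are illustrative assumptions, not our actual project parameters:

```c
#include <stdint.h>
#include <stdbool.h>

#define W    640   /* assumed frame width  */
#define H    480   /* assumed frame height */
#define GRID 4     /* assumed 4x4 grid overlay */

/* Mode 1 sketch: sum each color channel over an RGB888 frame
 * and map the dominant channel to a chord duration (ms). */
int chord_duration_ms(const uint8_t *frame) {
    long r = 0, g = 0, b = 0;
    for (long i = 0; i < (long)W * H; i++) {
        r += frame[3 * i];
        g += frame[3 * i + 1];
        b += frame[3 * i + 2];
    }
    if (r >= g && r >= b) return 250;   /* red-heavy: short chords */
    if (g >= r && g >= b) return 500;
    return 1000;                        /* blue-heavy: long chords */
}

/* Mode 3 sketch: a grid cell counts as "covered" when its average
 * brightness drops below a threshold (a hand blocking the light). */
bool cell_covered(const uint8_t *frame, int row, int col) {
    int x0 = col * (W / GRID), y0 = row * (H / GRID);
    long sum = 0;
    for (int y = y0; y < y0 + H / GRID; y++)
        for (int x = x0; x < x0 + W / GRID; x++) {
            const uint8_t *p = &frame[3 * ((long)y * W + x)];
            sum += (p[0] + p[1] + p[2]) / 3;
        }
    long n = (long)(W / GRID) * (H / GRID);
    return sum / n < 40;  /* brightness threshold, chosen arbitrarily */
}
```

In our project the frame transfer itself was handled in hardware; only the per-frame arithmetic like this ran as C on the soft processor.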

Skills:

  •  Verilog code in Xilinx Platform Studio configures the processor-based system.  [The hardware platform managed the transfer of data from the camera to the screen.  All the song data was also implemented in hardware.]
  •  C programming in Xilinx Software Development Kit.  [Everything else was done in software: the music player, game controls, game logic, and pixel computations.]

Key Takeaway:  This project was a turning point: creating the hardware infrastructure was important, but the software was the crux of the game!  My specialty was more on the hardware side, and I realized that I shouldn’t leave college without delving into the software domain.


DJ mode allows the user to mash pop songs together based on where their hand is positioned in the screen’s grid.