On April 22nd 2015 we were able to take our system for a preliminary test at the Get Up to Speed event in Sheffield. This was an incredibly productive day that allowed us to gather user feedback about the system, which we will use to make some improvements. Overall it seemed to me that users enjoyed their interaction with the system and were interested in where the project would go in the future. Hopefully the response to our final test will be as positive as the one we received at this event.
One of the issues I have been encountering while developing the controls for the system is replicating fine movements, in particular the rotation of the hand, which can be used to move the tools or the deformable object. This is due to the way the Kinect works: it can detect individuals within its field of view but cannot detect small movements such as the rotation of the wrist. I have therefore decided to add an additional method of control to the system: the Myo, a gyroscopic armband that allows for both gesture and fine motor control.
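The Myo reports the forearm's orientation as a quaternion, from which the wrist roll the Kinect cannot see can be recovered. As a rough sketch (this is the standard quaternion-to-Euler conversion, not the actual code from our Unity integration):

```python
import math

def quaternion_to_roll(w, x, y, z):
    """Extract the roll angle (rotation about the axis running along the
    forearm) in radians from a unit quaternion, using the standard
    quaternion-to-Euler conversion."""
    return math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))

# A 90-degree twist of the wrist about the forearm axis:
half = math.radians(90) / 2
roll = quaternion_to_roll(math.cos(half), math.sin(half), 0.0, 0.0)
# roll is now pi/2; this value can drive the rotation of a tool in the scene
```

In practice the roll value would be applied each frame to the tool or deformable object currently held by the user.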
With the Myo I can now track the rotational movement of the user's arm and use a number of unique gestures to perform actions in the system. I will outline these gestures and how I will use them in a future post.
Based on feedback from individuals trying out the system, I opted to add a “window”. This was because some individuals thought the environment felt quite small, leading to a claustrophobic feeling. My first attempt at making a “window” was simply to add an image to the front wall of the forge. However, as you can see from the provided image, this isn’t particularly realistic and looks more like a photo than a view of an outside world. As such, I opted to create a perspective-based window. To do this I remade the model of the forge, adding a small opening to act as a window, and then placed the same image behind the opening so that it is seen in perspective.
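Placing the image behind the opening means it must be larger than the opening itself, or its edges become visible as the viewer moves. A quick similar-triangles sketch of the sizing (the dimensions below are hypothetical, not measurements from the actual forge scene):

```python
def backdrop_width(window_width, eye_distance, backdrop_depth):
    """Minimum width the backdrop image must cover so it completely fills
    the window opening when viewed from eye_distance in front of the
    window, with the image placed backdrop_depth behind the opening.
    Derived from similar triangles through the edges of the opening."""
    return window_width * (eye_distance + backdrop_depth) / eye_distance

# Hypothetical numbers: a 1 m wide opening, viewer standing 2 m away,
# image placed 3 m behind the opening.
width = backdrop_width(1.0, 2.0, 3.0)  # 2.5 m
```

The same formula applies to the image height; the further back the image sits, the larger it must be, but the more convincing the parallax as the viewer moves.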
Ideally I would like to visit the forge again to obtain a panoramic shot, which I could use to add “depth” to the system.
One of the key selling points of our system is the ability to 3D print a memento for each user based on what they created. To do this the system needs to convert the model created by the user into a format known as .stl. This file format is not commonly exported by non-3D-modelling software. Thankfully the Unity Asset Store has a “plugin” that can be purchased to allow for .stl export.
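The .stl format itself is fairly simple: a list of triangular facets, each with a surface normal and three vertices. A minimal sketch of the ASCII variant, assuming the user's mesh is already available as triangles (this is for illustration only, not the purchased plugin's code):

```python
def facet_normal(a, b, c):
    """Normal of a triangle: cross product of two edge vectors, normalised."""
    ux, uy, uz = (b[i] - a[i] for i in range(3))
    vx, vy, vz = (c[i] - a[i] for i in range(3))
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    length = max((n[0] ** 2 + n[1] ** 2 + n[2] ** 2) ** 0.5, 1e-12)
    return tuple(comp / length for comp in n)

def to_ascii_stl(name, triangles):
    """Serialise a list of triangles (each three (x, y, z) vertices)
    into the ASCII .stl format."""
    lines = [f"solid {name}"]
    for tri in triangles:
        nx, ny, nz = facet_normal(*tri)
        lines.append(f"  facet normal {nx:e} {ny:e} {nz:e}")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append(f"      vertex {x:e} {y:e} {z:e}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# A single triangle in the z = 0 plane:
stl = to_ascii_stl("memento", [((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

Real exporters typically use the binary .stl variant for file size, but the structure is the same: one normal and three vertices per facet.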
However, this plugin can only be accessed within the Unity editor, so I’ve had to make some changes to allow for exporting while the system is running. It is also important to consider how the export will function without breaking user immersion; thankfully I had already considered this. In my previous post about the Myo I mentioned that it allows for unique gesture control. With this in mind I developed a “Canvas”-based menu system that can be accessed and navigated using the Myo. This menu allows users to export their created object to the .stl format, ready to be printed by our 3D printer.
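The menu logic itself is little more than a highlight that gestures move and confirm. A minimal sketch of the idea (the gesture names and menu items here are placeholders, not the Myo SDK's identifiers or our actual Canvas code):

```python
class GestureMenu:
    """Sketch of a gesture-driven menu: wave gestures move the highlight,
    a fist gesture confirms the highlighted item."""

    def __init__(self, items):
        self.items = items
        self.index = 0        # currently highlighted item
        self.selected = None  # confirmed choice, if any

    def on_gesture(self, gesture):
        if gesture == "wave_right":
            self.index = (self.index + 1) % len(self.items)
        elif gesture == "wave_left":
            self.index = (self.index - 1) % len(self.items)
        elif gesture == "fist":
            self.selected = self.items[self.index]

menu = GestureMenu(["Resume", "Export .stl", "Quit"])
menu.on_gesture("wave_right")  # highlight moves to "Export .stl"
menu.on_gesture("fist")        # confirms the export option
```

In the Unity version, the same event handling updates the Canvas UI rather than a Python object, but the state machine is equivalent.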
On 9th March 2014 the University was visited by Jason Bradbury, who is set to become a guest lecturer here. Before his inaugural lecture, Jason came round and visited a number of students working on projects at the University. One of the projects he looked at was mine; unfortunately he was in a bit of a rush, so I didn’t get the opportunity to ask him any questions. I did, however, manage to get a picture of him with the system.
If you would like to follow Jason on Twitter, he can be found here: https://twitter.com/jasonbradbury