Saturday, 23 March 2013

Sound Research

The principles of sound


Sound is produced when something vibrates against the surrounding air molecules, which pick up the vibration and pass it along as sound waves. So if there is no air there won't be any sound, which means there is no sound in space. We hear sounds when these vibrations reach our ears, as our eardrums are sensitive to changes in sound pressure. Sound waves travel outward from the sound's source at about 1,130 feet per second.
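
To put that speed into perspective, here is a quick bit of arithmetic using the figure above and the 5,280 feet in a mile:

    5,280 feet per mile / 1,130 feet per second ≈ 4.7 seconds

So the sound from something one mile away takes almost five seconds to reach you, which is why thunder arrives noticeably later than the lightning flash you see.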

Sound serves a number of different functions. One of its main purposes is to express emotion in entertainment such as films and games. For example, a sad scene in a film will have sad music played over it to reinforce that emotion in the viewers. In interactive media you hear two types of sound, diegetic and non-diegetic. Diegetic sounds are sounds that would naturally be there, like the sound of cheering in a stadium or the sound of a car on the street. These sounds are used to add more realism to any form of interactive media. Non-diegetic sounds are sounds that would not naturally be there, for example music playing in the background or a narrator. These sounds are used to add more emotional depth. Both types are used often and for different purposes.

Audio cues are commonly used in interactive media as a sound design principle. These are sounds that people recognise instantly without wondering what that noise was. A sound plays when something happens on screen so the user is aware of it. For example, when someone receives a message on Facebook a short sound plays so they know they have a message. Another way sound is used in interactive media is the acousmêtre, which is like the voice of a narrator in a video game or a film: they speak throughout the game or film to tell the story but are never seen. This sound is non-diegetic, as it would not normally be there.

There are two types of sound used today, analogue and digital, and there is a long-running argument over which one is better.
Analogue sound is sound that comes directly from microphone recordings. The information is stored as continuous waves that can be fed through a speaker in their original form. In a tape recorder, for example, the microphone's signal is recorded onto tape, which gives you an exact copy of the recorded sound.

Digital sound is pre-recorded analogue sound that has been digitised: the sound waves are converted and stored on the computer as numerical information, which the computer then processes to recreate and play the sound directly from that information. This conversion is normally called sampling. Take music on a CD: the sampling rate is 44,100, meaning there are 44,100 numbers (samples) for every second of music. To hear the music, the numbers are turned back into voltage waves. In short, the higher the sampling rate, the better the quality of the sound, and the bigger the file.
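
As a rough worked example of why better quality means bigger files (assuming standard 16-bit stereo CD audio):

    44,100 samples per second x 16 bits x 2 channels = 1,411,200 bits per second
    1,411,200 bits per second / 8 = 176,400 bytes per second
    176,400 bytes per second x 60 seconds = roughly 10.6 MB for one minute of uncompressed music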

Both analogue and digital sound have advantages and disadvantages. Digital sound never degrades over time the way analogue does, and it does not take up as much space because it can be compressed. Digital also does not have the annoying hiss in the background that analogue has. The advantage of analogue sound is that nothing is lost to sampling, so the recording keeps the full original waveform.

Overall it depends on personal preference: some people prefer the hiss that analogue recordings have, saying it sounds more natural, but I myself find it irritating. So in my opinion digital sound is better, as it takes up less space and lasts longer.


Monday, 18 March 2013

step by step


To start the creation of my soundtoy I had to model the keyboard in Cinema 4D.




To start I created a cube and resized it into the shape of a Mac keyboard. I also rounded off the edges to make it look better.

Next I created another cube, resized it into the shape of a key and positioned it on the keyboard. Then I duplicated it to make the other keys.


Some of the keys had to be resized again to match particular keys on the keyboard. When I had finished creating the keys I checked that I had the correct number and that they were all in the correct positions.
When that was done I got a picture of a Mac keyboard to use as a texture for each key.
To start I opened the picture in Preview, selected the key I wanted, cut it out and pasted it into a new file.
I then saved it in a folder and did this for each key on the keyboard.
Next I created a new material, loaded the key texture into it and applied it to the matching key.
I did this for every key on the keyboard.

When I finished I rendered it and then imported it into Unity. After importing it I ran into a problem: the textures did not work, so I would have to redo them all again in Unity.




In Unity I created a square room using four cubes, resizing and positioning them. Next I placed a spotlight and the keyboard inside the room, making sure the keyboard was in the middle and the brightness of the light looked right. Then I added a camera and used the mouse orbit script, making a few changes so it only moves the camera when you click and drag, and so the camera always centres on the keyboard.
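
To give an idea of what that change looked like, here is a simplified sketch in Unity's JavaScript. It is not my exact script, just the click-and-drag idea; the variable names and speed values are placeholders, and the target would be the keyboard assigned in the Inspector.

    // Sketch of a click-and-drag orbit camera (Unity JavaScript).
    // "target" is assumed to be the keyboard, assigned in the Inspector.
    var target : Transform;
    var distance : float = 10.0;       // how far the camera sits from the keyboard
    var rotateSpeed : float = 120.0;   // degrees per second while dragging

    private var angleX : float = 0.0;
    private var angleY : float = 20.0;

    function LateUpdate () {
        if (target == null) return;
        // Only rotate while the left mouse button is held down (click and drag).
        if (Input.GetMouseButton(0)) {
            angleX += Input.GetAxis("Mouse X") * rotateSpeed * Time.deltaTime;
            angleY -= Input.GetAxis("Mouse Y") * rotateSpeed * Time.deltaTime;
        }
        // Keep the keyboard in the centre of the view at a fixed distance.
        var rotation = Quaternion.Euler(angleY, angleX, 0);
        transform.rotation = rotation;
        transform.position = target.position - rotation * Vector3.forward * distance;
    }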






Next I added my help icon. I did some research on scripting buttons and found out how to make the help text appear when the button is clicked and disappear when it is clicked again.
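
The script I ended up with worked roughly like the sketch below, written in Unity's JavaScript using the OnGUI system. The positions, sizes and help text here are placeholders rather than the exact values from my soundtoy.

    // Sketch of a toggleable help button using OnGUI (Unity JavaScript).
    private var showHelp : boolean = false;

    function OnGUI () {
        // Clicking the button flips the flag, so the text appears and disappears.
        if (GUI.Button (Rect (10, 10, 80, 30), "Help")) {
            showHelp = !showHelp;
        }
        if (showHelp) {
            GUI.Label (Rect (10, 50, 300, 60),
                "Press the letter keys on your keyboard to play the sounds.");
        }
    }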













After recording my sounds and editing them in Audacity I imported them into Unity. Next I made a new input key for each letter on the keyboard. Once I had finished this I researched how to change an object's colour for a set duration when a key is pressed, and how to play a sound when a key is pressed. After figuring out what code to use I wrote a script for each key and its sound, and with that my soundtoy was finished.
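
Each key's script followed the same basic pattern, roughly like the sketch below in Unity's JavaScript. The key, colour and use of the clip length as the duration are placeholders for whatever each real key used, and it checks the key directly rather than going through the input entries I set up.

    // Sketch of one key's behaviour: play its sound and light up while it plays
    // (Unity JavaScript). Attached to the key object, which needs an AudioSource.
    var keySound : AudioClip;                 // the recorded sound for this key
    var pressedColor : Color = Color.yellow;  // colour while the sound is playing
    private var defaultColor : Color;

    function Start () {
        defaultColor = renderer.material.color;
    }

    function Update () {
        if (Input.GetKeyDown (KeyCode.A)) {   // this key reacts to the letter A
            audio.PlayOneShot (keySound);
            renderer.material.color = pressedColor;
            // Return to the default colour once the sound has finished.
            Invoke ("ResetColor", keySound.length);
        }
    }

    function ResetColor () {
        renderer.material.color = defaultColor;
    }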

Sketches and flowchart

Using Lucid Chart I created a flowchart for my 3D soundtoy interface.


These are the buttons I used for my paper prototype and wireframe.
I used these buttons because they are very clear and visible, and also because they are visual cues, meaning the people using my soundtoy would not have to think about what each button does as they would already know.

Mood board


Button designs


Final interface design


Interface Layout designs



Word search



Wednesday, 6 March 2013

wireframe

After completing my paper prototype I made a wireframe with a few changes based on the results of the paper prototype. I removed the music note symbols that appeared after a key was pressed, as they would not be needed. To test my wireframe I got 3 people to try it and screen recorded them.



The results were pretty much the same as the results from the paper prototype: once each user clicked on the help icon and read the text they easily understood how to use the interface. All 3 users were able to use the record and playback function successfully without any trouble.

After testing this wireframe I decided to change my soundtoy a bit. I decided to remove the key animations and instead have the keys change colour when pressed and return to the default colour when the sound had stopped.

Paper prototype

For my soundtoy interface I made a paper prototype and recorded 3 people testing it. I did this to find out whether the interface was successful.





User 1 clicked on the help icon to start with but quickly understood how to use the interface afterwards and did not take long deciding what to do next. User 2 was also able to understand the interface quickly after pressing the help button; even though they took longer than user 1 to decide what to do next, they still got the hang of the interface in a short amount of time. User 3 also understood how to use the interface quickly without having to put much thought into it. This shows that the interface for my soundtoy is not too complicated and does not make the user think about it too much.

Friday, 1 March 2013

Evaluation

Using Unity I have created a working 3D soundtoy with an interactive interface design. It is a basic white room with an Apple Mac keyboard in the middle of it, which I modelled in Cinema 4D. The user presses a letter on their own keyboard and the matching key on screen lights up and plays a sound. I recorded each sound and then edited it in Audacity before importing it into Unity. I've learnt a lot from the creation of my 3D soundtoy. For example, I now have an understanding of basic scripting and coding using JavaScript. During the time I spent solving coding problems I improved my problem-solving skills, finding new and quicker ways to solve complex problems. I also learnt about interface design principles and now understand each one in more detail, which helped me when I was creating my interface.

I think the point of this unit was to give us an overall understanding of how interface design works. From it I learnt about interface design principles and how to apply them to my own interfaces, which will help with my work in other units. The scripting and coding I have learnt during this unit will also help me in the future when communicating with programmers.

Overall I am happy with the result and I now have a working and enjoyable soundtoy, even though there are a few things I could have done better, like the way it looks. I could have made it more appealing by adding more colour or more detail, because at the moment I think it looks quite plain. I could also have done a better job with the sound editing, so that instead of having a slight delay on some of the sounds they start straight away. Next time I work on a project like this I will put more research into coding beforehand so I don't spend as much time solving problems, and I'll also spend more time planning everything out.

Wednesday, 27 February 2013

Interface design research

Interface design is found everywhere, whether you notice it or not. A user interface is an interaction between two objects, typically a human interacting with a digital object such as a computer. How good an interface design is depends on a range of different things, starting with the designer's personal preference: some people like things that others might not, which can cause a problem when people do not like what the designer thinks is good. A good example of this is Mac OS versus Windows, as some people prefer the Mac OS and others prefer Windows.


Although the Windows OS was made 13 years ago it is still one of the most popular operating systems, even as newer ones such as the Mac OS appear; this is because it is regularly updated, so it does not become outdated. Windows and the Mac OS are similar in terms of popularity, but their approaches to usability are completely different. One of the best features of Windows is the Start button in the bottom left, which is also a visual cue, so people understand what it is without having to think about it too much. From the Start button the user can browse all the programs and files on their computer. There is also a search bar at the bottom where they can instantly access what they are looking for by typing it, or a keyword relating to it, into the search bar. The equivalent part of the Mac OS is the Spotlight feature, which is the Mac's version of a search bar. Spotlight is harder to identify as a search bar than the Windows version, as it is a small icon in the top right corner of the screen with no description, which requires the user to figure out how it works themselves. Although it is more complicated than the Windows version, once a user has got the hang of it they can use it to find files and folders more quickly and accurately than the Windows search bar. Overall both search bars save a significant amount of time when finding files and opening software on the hard drive.

In the end both user interfaces are well designed and user friendly, so it all comes down to personal preference. In a review on why the Mac OS is better than Windows, Adrian O'Connor stated: "Windows Search is a resource hog. A real resource hog. Spotlight doesn't consume 1.3GB of memory on my Mac, whereas Windows Search does on my workstation." He favoured Spotlight not just because it was faster, but because it did not use as much memory as Windows Search did on his workstation. From this I would conclude that people prefer the option that is faster and uses the fewest resources.

Visual cues are a great example of the metaphor principle in interface design. A visual cue is a symbol used by designers to relay a message to users without using text and without taking up as much digital space. Using a visual cue on a button also makes it easier for users to understand what the button does without having to read instructions, meaning they do not have to think about it. If users do not have to waste time thinking about what each button does they will find the product more satisfying, as it is simple, easy to use and not time-consuming. The eject symbol is a good example: it is known worldwide, as the eject button is used on computers, DVD players and consoles, so it is common knowledge what it does and no text is required.

An interesting pair of interfaces commonly used today are the PS3 dashboard and the Xbox 360 dashboard. Both interfaces are very different and unique, which has led to a debate over which one is better. The Xbox 360 dashboard has changed a few times over the years, altering the style and giving it a new look. This keeps users interested and prevents them from getting bored, which is a very intelligent design decision. The way the PS3 applies this principle is by giving users the freedom to change the theme and customise the icons. Using the PS3 web browser, users can download thousands of different themes created by other users and apply them to their dashboard. The PS3 dashboard is also less complicated than the Xbox 360's and it is easier to find what you are looking for; on the other hand, the text on the Xbox 360 is a lot easier to read. In my opinion the Xbox 360 dashboard is better, as it has brighter colours and feels friendlier than the PS3's, and the custom avatars add more life to it.

Computers are controlled by a mouse and keyboard: the user uses them to interact with and control the computer, and the feedback from the mouse is the movement of the cursor. The mouse has a ball or a laser sensor on the bottom to detect movement and move the cursor on screen. Before the laser sensor was introduced, the mouse used a ball to track movement, and the ball worked fine until dirt got inside. Once dirt got in, the mouse became imprecise and the cursor would wobble all over the place, which is very irritating. This can happen with a laser sensor mouse too, although not very often. Even though the sensor mouse is more responsive than the ball mouse, the ball mouse copes better with uneven and rough surfaces, as the sensor is less effective when it is not on a flat surface or a mouse mat.

The mouse has two buttons, left click and right click, with a scroll wheel in the middle. The design of the mouse is close to perfect, as it fits comfortably into the user's hand and the buttons are easy to click: they sit right under the index and middle fingers, so the fingers do not have to move to press them, causing no strain. The left button acts on whatever the cursor is over, so if the cursor is over a file and the left button is clicked it will open it. The right button opens a sub-menu with more options, so instead of opening a file a right click brings up a sub-menu where you can copy, delete and so on. The scroll wheel is used to scroll up and down a page by simply rolling it with your finger. Some mice also have two buttons on the side which can easily be clicked with the thumb; these act as back and forward buttons on webpages. They save time because the user does not have to move the cursor over the back button on screen; they can just do it with a simple movement of their thumb.

The touch screen is a human-computer interface controlled simply by touch. The feedback varies on each touch screen depending on what buttons are on the display. A touch screen can provide any number of buttons if programmed correctly, since the screen can display anything we program it to. Even though touch screens are used on phones, iPods and so on, they are not very popular on computers or consoles, as people prefer a mouse and keyboard or a controller. There are many reasons for this, such as having to keep reaching across to touch the screen and eventually getting an aching arm, whereas with a mouse you can just relax in your chair and use it. David Pogue commented, "Part of the problem was that the targets—buttons, scroll bars and menus that were originally designed for a tiny arrow cursor—were too small for fat human fingers." I agree with him, as I sometimes find some buttons on my phone a bit small, so the even smaller icons on a computer would be even more annoying, especially ones in a group; this might cause you to press the wrong one, as they are too close together. Sometimes a touch screen is also slow and unresponsive, which can be irritating when you are trying to complete simple tasks. However, touch screens can save time, as you do not need to drag the mouse cursor to a location on screen; you can just press it directly, and you can change pages more easily by sliding your finger across the screen. These are some of the aspects that make a touch screen better than a mouse, because they are simple and save time. Overall it comes down to each person's preference.

Another interesting interface is the Nintendo 3DS menu. The 3DS menu is very easy to navigate, as the user can use the touch screen to scroll left or right and tap any of the icons. If the icons are too small to read clearly, the user can adjust their size, making them smaller or bigger to their liking. This interface also uses a set of different visual cues along the top of the screen; this takes up less digital space, and if there were lots of text instead it would look a mess and be confusing for the user. Using visual cues makes it easier and faster for the user to navigate through the menu, and the faster it is the more appealing it is to users. The 3DS is also the first handheld device with 3D, so the user can see everything in 3D, making it more interesting. There is an option to turn the 3D off, as it can cause headaches if used for a long period of time, which would be a problem for users if it could not be turned off. In my opinion the interface design for the 3DS is great, as it is really easy and quick to navigate, everything on it is easy to read, and the colours are warm and give off a happy vibe, making people want to use it more.

In a TED conference about five years ago, a university honours student introduced a prototype of a new type of windows interface. He took the working environment of an office, like Post-it notes and paper spread all over a desk, and put it on the screen digitally. He demonstrated how you can make piles of paper, hang documents up, or throw them around and at the piles. He also showed that when you enlarge a file its physical attributes change: smaller files can no longer push it around, while it can easily knock down piles or push other files aside. This style of interface is interesting, as it feels like the user has more freedom: they can move their files around more freely, and if people like having a messy desk and using lots of Post-it notes, this gives them the chance to do that digitally.

A great and very important design principle is the "don't make me think" rule. The purpose of this rule is to stop people using an interface from having to think about what to do next. Instead they should instantly know which button does what and how the interface works without putting much thought into it. This design principle is used in interface design worldwide. A good example is the basics of a phone and using it to call someone, as the user instantly knows to put in the number and press the green button to call. Admittedly people grew up using phones, so that is common sense, but touch screens are quite new and people still knew how to use them straight away, even the first time. This is because it is a touch screen, so people knew to use their fingers, and it only involves simple hand movements. For example, users knew without thinking to touch something in order to select it, or to drag a finger across the screen to change page. This is where the "don't make me think" rule was applied perfectly to the interface. "I should be able to 'get it' - what it is and how to use it - without expending any effort thinking about it." This was taken from Steve Krug's book Don't Make Me Think.

Consistency is another important design principle, and one I shall be applying to my 3D soundtoy. This is so the people using my soundtoy do not have to keep learning each set of buttons, which would be frustrating. With consistency, users can learn how to use my soundtoy more quickly, so when the layout changes they will not be stuck wondering what to do next, because the same principles of interaction apply throughout the application. This principle works for plenty of other devices and applications. For example, on video game consoles such as the Xbox 360, no matter the game, the user always knows that the A and B buttons are the main buttons for navigating the menu, or in a shooting game that RT is the button you press to shoot. "That's what consistency in design is about. Get the routine, navigational stuff consistent so your audience can settle in, then you can have fun with the real stuff." This is quoted from an article by John McWade on Design Talk. I totally agree with this quote; by applying this principle to my 3D soundtoy I hope to make it easier to use and more enjoyable.