Adam Henriksson is a designer with a background in industrial and interaction design from Umeå Institute of Design. He has experience working on multidisciplinary collaborations and self-directed efforts within academic and commercial settings.
His work stretches across a broad spectrum of interactive products, interfaces and experiences, with a consistent interest in the relationship between people and technology.
The Web has experienced exponential growth since its commercialisation in the '90s. Gradually, the search engine has become an integral part of seeking information and navigating this seemingly immaterial space. The mutual transaction of data between individual and search engine is a prime example of human-computer interaction, as both take an active role. Yet search engines signal a sense of frictionless efficiency, and there is much to be explored in the way people perceive, interact with and relate to them.
The project proposes an approach to designing search features grounded in user experience rather than efficiency: one that not only gratifies the need for information, but supports a diversity of journeys. The result, Exposeek, is an experiential prototype supporting exploratory browsing based on principles of scalable infrastructure, transparent computation and serendipitous information. Suggestive queries, legible algorithms and augmented data provide additional insights and present an alternative way to seek and peruse the Web.
The space is designed to offer the user control over their surroundings in order to alleviate stress and promote mindfulness. It responds to the presence of people, generating intimate spaces that support relaxation.
The position of one or multiple users triggers moving partitions, light and a generative soundscape, allowing for an explorative configuration of the space.
The prototype was built in Philips' Experience Lab and later tested with a diverse group of experts within the fields of architecture, interaction design, biomimicry and mental wellness.
Full scale prototype built in the Experience Lab.
Banks are experiencing a revolution. Not too long ago, customers had to go to a local office to handle all financial tasks. Today, all active communication with the bank is initiated by the customer, leaving them with full responsibility and little social contact with a professional. This creates a “service provider - client” setup where the customer is there for the bank instead of the other way around. For some, dealing with financial issues does not come naturally, leading to misuse or non-use of banking services.
Common passive communications are monthly, quarterly or yearly status reports. This is a standard model with limited possibility for the customer to influence frequency and content. Is there a way to support individuals in actively setting up a service where events are seamless, appropriate and contain the desired content?
OI! is a collection of connected artefacts that abstract data from personal bank accounts, helping individuals better understand their current financial status.
Each artefact uses a unique output, running recipes set up and customized by its owner. These recipes accept both binary alarms and reminders and quantitative data as inputs. These spimes can live in semi-public space, due to the personalized encryption interpreted only by its owner, author and director.
Prototypes in context:
"If daily purchases surpass € 50 — Release the scent."
"As final payment comes closer — Swing the arm from left to right with higher frequency."
"As account balance is declining — Drop the line."
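The recipes above pair a condition on account data with a physical actuation. A minimal sketch of that pairing, with illustrative names and thresholds (the actual OI! implementation is not documented here):

```python
# Hypothetical sketch of an OI! recipe: a condition on account data
# mapped to a physical actuation. Names and thresholds are illustrative.

def make_recipe(condition, action):
    """Bundle a condition on account data with an actuation callback."""
    def run(account_data):
        if condition(account_data):
            return action()
        return None
    return run

# "If daily purchases surpass EUR 50 - release the scent."
scent_recipe = make_recipe(
    condition=lambda data: data["daily_purchases"] > 50,
    action=lambda: "release_scent",
)

scent_recipe({"daily_purchases": 62.40})  # triggers the actuation
scent_recipe({"daily_purchases": 18.00})  # stays idle
```

The same structure covers both binary alarms and quantitative inputs: the condition reads whatever fields the owner's recipe refers to.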
The Elevated Coffee Machine is an experiment in social behavior around a coffee machine. It is an everyday object used for one thing, and one thing only: serving coffee. For one day, it gained the added behavior of playing elevator music after someone fills up their coffee. The behavior was added to evoke an emotion.
The experiment was prototyped with a computer placed inside the coffee machine, which analysed the sound level. Once the machine started to pour coffee, it randomly picked a song and played it for about a minute. Two cameras were hidden to capture the reaction of the thirsty test subject. The experiment was developed and tested in a day, together with other sounds such as a dial-up modem, a spaceship and whispering voices.
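The trigger logic is simple enough to sketch: the hidden computer watches the sound level, and when pouring pushes it over a threshold, a random track starts. Threshold value and playlist below are made-up, not the values used in the prototype:

```python
import random

# Illustrative sketch of the coffee machine's trigger: watch the
# microphone level and start a random track when pouring begins.
# The threshold and playlist are invented for this example.

PLAYLIST = ["elevator_1.mp3", "elevator_2.mp3", "dialup.mp3"]
POUR_THRESHOLD = 0.6  # normalised sound level

def on_sound_level(level, playing):
    """Return a track to start when pouring is detected, else None."""
    if level > POUR_THRESHOLD and not playing:
        return random.choice(PLAYLIST)
    return None
```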
Most Android applications are written in Java, a strict but portable object-oriented programming language. The language and its development environments, such as Eclipse, are widely popular but take time and considerable effort to get comfortable with.
Processing, the open source programming language and integrated development environment, has become one of the fastest ways to sketch ideas through code. The language builds on Java, but uses simplified syntax and functions. Like Java, it is platform independent and runs just as easily on the Android mobile platform. Together with the free Android software development kit, it allows anyone to push an application to an emulator or a device. While this is not a walk in the park, due to compatibility issues, it works well once set up.
Touch / Shapes is a brief exploration made while collecting information and writing a tutorial for downloading, installing, troubleshooting and using Processing for Android. It showcases some fundamental functions used for building a simple multitouch application.
The first example is the most basic multitouch application, enabling up to ten touch events. Each touch event is displayed as a differently coloured ellipse, with its size matching the pressure.
The second example uses two to five touch events to draw geometric shapes. The shape, size and orientation depend on the number of fingers placed on the device.
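One way the second example's geometry might be derived (this mapping is an assumption for illustration, not the sketch's actual code): vertex count from the number of fingers, size from their spread, orientation from the first finger's angle relative to the centroid.

```python
import math

# Assumed geometry for a touch-to-shape mapping: a regular polygon is
# derived from the touch points. Purely illustrative.

def shape_from_touches(touches):
    """touches: list of (x, y); returns (sides, centre, radius, angle)."""
    n = len(touches)
    if not 2 <= n <= 5:
        return None
    cx = sum(x for x, _ in touches) / n
    cy = sum(y for _, y in touches) / n
    # Size: mean distance from centroid; orientation: first finger's angle.
    radius = sum(math.hypot(x - cx, y - cy) for x, y in touches) / n
    angle = math.atan2(touches[0][1] - cy, touches[0][0] - cx)
    sides = max(n, 3)  # two fingers could equally map to a line or circle
    return sides, (cx, cy), radius, angle
```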
Consumer products are able to exchange data successfully over standard protocols. The communication is supported by predefined rules that allow the sender to provoke a response from the receiver in a synchronous way.
'One Bit' explores communication between two devices using an ethernet cable and a maximum specification of one bit. The sensing device has a jack plug input and analyses sound across seven frequency bands. The actuating device reads the signal and powers a small DC motor, creating a varying airstream that allows a lightweight plastic ball to travel up and down to the rhythm of the music.
The prototype demonstrates that qualities of a complex input, such as music, can be communicated over a simple protocol and expressed as tangible output.
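The principle can be sketched in a few lines (illustrative, not the actual firmware): the sender collapses a continuous level into one bit per time step, and the receiver low-passes the bit stream back into a smooth motor power:

```python
# Sketch of the one-bit idea: a continuous input level is thresholded to
# a single bit per time step; the receiver smooths the bit stream into a
# motor power level. Threshold and smoothing factor are illustrative.

THRESHOLD = 0.5

def encode(levels):
    """Sender: collapse each sampled level to one bit."""
    return [1 if level > THRESHOLD else 0 for level in levels]

def decode(bits, smoothing=0.3):
    """Receiver: low-pass the bit stream into a motor duty cycle."""
    power, out = 0.0, []
    for bit in bits:
        power += smoothing * (bit - power)
        out.append(power)
    return out
```

Even though each step carries only one bit, the density of 1s over time conveys the intensity of the music, which is what makes the ball bob to the rhythm.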
The pendulum, often used in accurate timekeeping technology, is a mass suspended from a pivoting point. Its hypnotising motion may look simple, but the mathematics behind it is rather complicated. Stacking multiple pendula creates a dynamic system with strong sensitivity to initial conditions.
The Random Motion Machine consists of a pivot placed on a rail with linear motion. The pivot faces upwards, allowing each pendulum a free range of motion, and the various combinations of speed, travel intervals and pendulum dimensions create different sets of behaviour.
Its motion may seem random at any given moment, but by visualising its behaviour over time one can start to see patterns and understand its capabilities and restraints.
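A much-simplified model of the machine is a single pendulum whose pivot accelerates horizontally (the real machine stacks several pendula, which is what makes it so sensitive). For a pendulum of length L with pivot acceleration a and angle θ from the downward vertical, θ'' = -(g/L)·sin θ - (a/L)·cos θ. A numerical sketch using semi-implicit Euler integration:

```python
import math

# Simplified model: one pendulum on a horizontally accelerating pivot.
# theta is measured from the downward vertical; semi-implicit Euler
# keeps the undriven small-swing amplitude stable.

G = 9.81  # m/s^2

def simulate(theta0, pivot_accel, length=1.0, dt=0.001, steps=5000):
    """pivot_accel(t) -> horizontal pivot acceleration in m/s^2."""
    theta, omega = theta0, 0.0
    trace = []
    for i in range(steps):
        a = pivot_accel(i * dt)
        alpha = -(G / length) * math.sin(theta) - (a / length) * math.cos(theta)
        omega += dt * alpha
        theta += dt * omega
        trace.append(theta)
    return trace

# Still pivot, small swing: the amplitude stays near the start angle.
trace = simulate(0.1, pivot_accel=lambda t: 0.0)
```

With a periodically driven pivot the same equations produce the erratic-looking motion described above, which is why visualising the trace over time is needed to see any pattern.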
A slight movement from one person needs to be reciprocated and adjusted by the other two, which leads to interesting unspoken dynamics among the participating audience. It is an exploration of live video aesthetics and social dynamics.
Unlike conventional design processes, there was no ideation or concept development phase. The initial task was to set up multiple video feeds, then use creative coding as a tool and process to explore cinematographic techniques, such as slit-scan, and develop new techniques based on the knowledge gained from each experiment.
Live video from three cameras is deconstructed, randomly combined and reconstructed into a single feed. Participants were able to document static frames by creating loud sounds, such as a clap.
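The slit-scan technique mentioned above can be reduced to a few lines: the output image takes one pixel column from each successive input frame, so time is spread across the horizontal axis of a single picture. A minimal sketch on plain nested lists (real implementations work on camera frames):

```python
# Minimal slit-scan sketch: column i of the output image is taken from
# column i of input frame i, spreading time across the x axis.

def slit_scan(frames):
    """frames: list of equally sized 2D grids (lists of pixel rows).
    Assumes each frame is at least len(frames) pixels wide."""
    height = len(frames[0])
    width = len(frames)
    return [[frames[col][row][col] for col in range(width)]
            for row in range(height)]
```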
Prototyping physical experiences requires knowledge of both software and hardware. Each component has its own capabilities and restraints. Combining the two can prove difficult without technological proficiency.
The project, originally with the working title OpenMove, is an initiative born out of curiosity to simplify the process of prototyping with input devices for designers with little to no experience with programming.
Kaliber is a standalone middleware that communicates through the standard OSC protocol and can therefore be used with different programming languages. The application can run in parallel to any software sketch; communication can be muted, and parameters can be redirected or modified, which are supporting features for 'live prototyping'.
Kaliber supports a wide variety of input devices. The JInput API is the main 'Human Interface Device' library and enables hardware such as the 3Dconnexion SpaceNavigator, PlayStation DualShock 3 Sixaxis, Xbox 360 gamepads for PC, Dance Dance pads, Arduino Leonardos, keyboards and mouse-like devices. The support for PlayStation Move relies on Thomas Perl's PS Move API to receive data from the motion controller, output haptic events and set the colour of its LED.
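The OSC transport Kaliber relies on has a simple wire format, which a minimal encoder can illustrate (the address `/input/axis` is a made-up example, not Kaliber's actual namespace): strings are NUL-terminated and padded to 4-byte boundaries, followed by a comma-prefixed type tag string and big-endian arguments.

```python
import struct

# Minimal OSC 1.0 message encoder (sketch). Strings are NUL-terminated
# and padded to 4-byte boundaries; numbers are big-endian.

def _pad(data):
    # Pad with 1-4 NULs so the field ends on a 4-byte boundary.
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address, *args):
    tags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, float):
            tags += "f"
            payload += struct.pack(">f", arg)
        elif isinstance(arg, int):
            tags += "i"
            payload += struct.pack(">i", arg)
        else:
            raise TypeError("only int/float in this sketch")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

msg = osc_message("/input/axis", 0.5)
```

Because the format is this plain, any language with a socket and a byte packer can talk to an OSC peer, which is what makes the middleware language-agnostic.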
The application was released for a week of tutoring "Experience Prototyping - Game Controllers and Input Devices" with the Interaction Design Program at Umeå Institute of Design. The participants gained skills and experience building prototypes of objects, installations and games using Kaliber.
Four players navigate a jungle of 20 hanging PlayStation Move controllers. Each player is assigned a colour. Players need to hold controllers of their colour by pressing a button. As long as you have two controllers pressed, you are safe. If no controllers of your colour are pressed, you are eliminated instantly. If only one controller of your colour is pressed, you begin to lose life energy quickly. Grab a second controller, quickly, before all your energy drains and you are eliminated. Colours move around the jungle and the pace speeds up as the game progresses. "Tarzan" around the space, and use your body to block the others' paths. Awkward body contact encouraged.
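The elimination rules above boil down to a per-tick check on how many controllers of a player's colour are held. A sketch of that logic (the drain rate is an illustrative guess, not the shipped tuning):

```python
# Per-tick rule from the game description: two or more pressed
# controllers of your colour = safe, exactly one = rapid energy drain,
# none = instant elimination. DRAIN_PER_TICK is illustrative.

DRAIN_PER_TICK = 0.2

def update_player(energy, pressed_count):
    """Return the player's energy after one tick (None = eliminated)."""
    if pressed_count == 0:
        return None                      # instant elimination
    if pressed_count == 1:
        energy -= DRAIN_PER_TICK         # losing grip: drain quickly
        return energy if energy > 0 else None
    return energy                        # two or more held: safe
```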
This competitive PlayStation Move game was conceived and developed in 48 hours during the Nordic Game Jam 2012 in Copenhagen, Denmark. Since then it has been exhibited at Spilbar 14, Come Out & Play NYC and Indiecade.
Deaf culture is as rich as many other subcultures and far more than something defined by an impairment. Sign language has developed into a fully accepted language worldwide, and there is theatre, film, poetry and even music, all enabled by it.
Ethnographic studies became the foundation for understanding how people take part in these activities and how social meetings are organised. Focusing on parents of a deaf child, the aim was to trace how and where they get access to information about their situation and find the contacts they need.
The result is a service that functions as a hub for people concerned with the topic of deafness or hardness of hearing. The service is built upon local social networks, which facilitate information sharing within Deaf communities. It supports the existing, rich culture with tools that connect hearing, deaf and hard-of-hearing people on a local level. By providing a web and a mobile app, it makes it easier for people to get in contact and take their first steps in this diverse world.
The term 'Planet Eyeth' is used in deaf communities to describe their version of 'Planet Earth'. The service uses a system of hashtags to determine what posted content relates to. Users can decide which topics they are interested in and follow the corresponding hashtags to read updates in their feed. As the platform does not use friend requests or groups, nobody new to the platform is excluded. PlanetEyeth focuses strongly on the local aspect to support personal meetings. By letting users know what is going on around them, the service helps them find peers in their town or area.
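The hashtag-driven feed described above is essentially a filter: with no friend requests or groups, a user's feed is simply every post carrying a hashtag they follow. A sketch with illustrative data shapes:

```python
# Sketch of the hashtag feed: a post appears in a user's feed whenever
# its tag set intersects the user's followed tags. Data is illustrative.

def build_feed(posts, followed_tags):
    """posts: list of (text, set_of_tags); returns matching texts."""
    followed = set(followed_tags)
    return [text for text, tags in posts if tags & followed]

posts = [
    ("Sign language poetry night on Friday", {"#poetry", "#events"}),
    ("New research article", {"#research"}),
]
feed = build_feed(posts, {"#events"})
```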
With the dissemination of the internet, distance communication now plays a big role for the Deaf community and enables a global exchange. PlanetEyeth allows the Deaf community to communicate in their native sign languages. Short video clips can be recorded and posted in combination with, or instead of, written sentences. This gives people who want or need to learn sign language access to a large number of native users and learners at the same time.
Edited material from documentary "Through Deaf Eyes".
Windows phone application.
A game starts with all players standing in a circle, each taking turns striking their opponents. A strike must be a swiping movement ending in a frozen pose. Other players are only allowed to dodge an attack by moving their arms, also ending in a frozen pose. Once hit, a player is out. The last player left is the winner.
In this digital prototype, each player wears a pair of gloves containing motion controllers. The sensors act as impartial judges, allowing for a fair game. The game's turn-based rule still applies, but instead of a logical rotation, a random player is selected and receives haptic feedback through the gloves. This way, no player can anticipate who is going to make the next move. The game allows the striker one single move, while the others can slowly dodge the attacks.
Johann Sebastian Bach's Brandenburg Concertos are widely regarded as some of the best orchestral compositions of the Baroque era.
This typographic exploration visualises sound and music on a vinyl record sleeve. It is an exercise in minimalist aesthetics, restricted to typography alone: the Univers font and a composition of eight horizontal lines placed in a grid. The idea behind the sleeve was to frame the text, creating weight and shape as an effect of negative space.
Today we are introduced to playing with friends and family at a young age. Playground games, or "folk games", can be played for competition or entertainment, but it is the social engagement that sets them apart from the majority of computer games today. However, terms like 'social gaming' or 'gamification' tend to be widely used by developers for both mobile and web-based platforms. The common aspects are to share, compare and impress your peers, but the feedback and recognition can be classified as a rather limited or shallow 'stroke'.
Game mechanics can create immersive experiences, raise questions and give people incentives for actions. Players are put into a role where they can safely explore the tasks they are given. As "King of the Hill", one is allowed to push someone down a slope, while such behavior is unacceptable in any other context. Time and complexity are two relevant factors in gameplay. Many new 'social games' are long or endless, but history shows that some of the longest-lasting games have been short and spontaneous. Rock-paper-scissors is a prime example of a short, 'portable' game played for countless reasons. Social contexts are complex, but carry influential factors for how a game is played. Rules can change drastically depending on location, occasion and the people engaging in the activity. This way, one can also look at multiplayer games as social experiments.
Pockit is a portable motion controller. It measures parameters like motion and location, and gives feedback through vibration, sound and visual cues. The controller is an extension of the player, enabling play everywhere. A hive-mind community can log locations and create hot spots, much like gaming arcades in the '80s.
The controller shares qualities with today's couch-friendly controllers, while still being as portable as the mobile products carried in pockets or bags today. What sets it apart is the resistant rubber-like 'skin' wrapped around the components. The same material forms a dynamic joint, containing two bend sensors, letting the player navigate a text-based interface with a slight twist. When turned off, the controller can be folded to fit a pocket. Inputs are based on motion measured by an accelerometer, a gyroscope and a magnetometer (compass), combined with a GPS logging movement, orientation and location. The device outputs haptic feedback through a rumble motor, audio feedback, and simple visual cues through the screen and skin.
The games are closely related to children’s play and sports, but the digital format allows the device to act as an unbiased referee. It encourages everyone to be physical and gives a reason to break norms. Since no graphics are used, games are simple and intuitive, enabling player rotations and a more active role for spectators.
Audio is commonly used to make games more immersive and expressive. Sound design is often added after the initial conceptual phase. Beat the Universe celebrates sound as a main mechanic and driver of the story.
All the planets in Happy Space are inhabited by singing animals. You control one of these animals. The objective on each planet is to determine which of the local animals' hummed songs is the most harmonic match to your animal's song. When a beatbox ensemble is gathered - one member from each of the four planets - the player is evaluated on how well the final band fits together.
The game was created by team Rocklobster in less than 48 hours at the Nordic Game Jam 2011. It made it to the finals along with great games like Tikkiit and Johann Sebastian Joust.