Sile O'Modhrain's Research Projects 

The Holy Braille 

It has been over 30 years since there has been any significant change in the technology used to build refreshable braille displays. The existing piezoelectric cells are expensive and complex to manufacture, cannot be packed tightly enough to create a full page of text, and are not suited to rendering tactile graphics. In this project, we take a completely new approach, using microfluidic techniques to actuate and control the bubbles that move individual braille dots. Our aim is to create a high-density refreshable tactile array at a much lower cost.
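
To make the rendering side of such a display concrete, here is a minimal sketch of how a line of text might be mapped onto a dense pin array. The six-dot patterns are standard braille, but the set_pin driver call and the two-column cell layout are illustrative assumptions, not our hardware's actual interface.

```python
# Minimal sketch: translating text into pin states for a refreshable
# braille array. set_pin() is a hypothetical display-driver callback.

# A subset of Grade 1 braille: character -> set of raised dot numbers,
# using the standard numbering (dots 1-3 left column, 4-6 right column).
BRAILLE = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    " ": set(),
}

# Dot number -> (row, col) position inside a 2-column x 3-row cell.
DOT_POS = {1: (0, 0), 2: (1, 0), 3: (2, 0), 4: (0, 1), 5: (1, 1), 6: (2, 1)}

def render_line(text, set_pin):
    """Raise or lower pins for one line of text on a dense pin array."""
    for i, ch in enumerate(text.lower()):
        dots = BRAILLE.get(ch, set())
        for dot, (r, c) in DOT_POS.items():
            set_pin(r, i * 2 + c, dot in dots)  # each cell is 2 pins wide

# Demo: draw the pin states as text instead of driving hardware.
text = "bad cab"
frame = [["."] * (2 * len(text)) for _ in range(3)]
render_line(text, lambda r, c, up: frame[r].__setitem__(c, "o" if up else "."))
print("\n".join("".join(row) for row in frame))
```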

Watch Video of The Holy Braille.

Related Publications:

  • Morash, V., Russomanno, A., Gillespie, R. B. & O'Modhrain, S., (2017). "Evaluating approaches to rendering braille text on a high-density pin display" IEEE Transactions on Haptics, 2017

  • Russomanno, A., Xu, Z., O'Modhrain, S. & Gillespie, R. B., (2017). "A Pneu Shape Display: Physical Buttons with Programmable Touch Response" in proceedings of the IEEE World Haptics Conference (WHC), 2017

  • Russomanno, A., O'Modhrain, S., Gillespie, R. B. & Rodger, M. W. M., (2015). "Refreshing Refreshable Braille Displays" IEEE Transactions on Haptics, Vol. 8, No. 3, pp. 287-297, July-Sept. 2015 (Link to Publication)

  • O'Modhrain, S., Giudice, N. A., Gardner, J. A. & Legge, G. E., (2015). "Designing Media for Visually-Impaired Users of Refreshable Touch Displays: Possibilities and Pitfalls" IEEE Transactions on Haptics, Vol. 8, No. 3, pp. 248-257, July-Sept. 2015 (Link to Publication)

  • Russomanno, A., Gillespie, R. B., O'Modhrain, S. & Burns, M., (2015). "The design of pressure-controlled valves for a refreshable tactile display" in proceedings of the IEEE World Haptics Conference (WHC), Chicago, IL, June 2015 (Link to Publication)

  • Russomanno, A., Gillespie, R. B., O'Modhrain, S. & Barber, J., (2014). "Modeling Pneumatic Actuators for a Refreshable Tactile Display" in proceedings of EuroHaptics 2014, Versailles, France, June 24-26 (Link to Publication)

  • Russomanno, A., Gillespie, R. B., O'Modhrain, S. & Barber, J., (2014). "A Tactile Display Using Pneumatic Membrane Actuators" in proceedings of EuroHaptics 2014, Versailles, France, June 24-26

People:

Support: This project is funded by the National Science Foundation, NSF Award ID 1319922.


Mesh 

Image: Picture of iPaq with Mesh Maze and Mesh hardware platform
Image: Picture of Tilt Control in Mesh

Mesh is a hardware platform that integrates inertial sensing, a compass and a vibrotactile display into a backpack for a standard iPaq. The objective of Mesh is to provide a platform for prototyping applications that combine haptic, visual and auditory display in a single hand-held device capable of communicating with other similar devices. Current examples of such applications are the Body Mnemonics, ContactIM and Topographic Torch projects detailed below.
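
As an illustration of the kind of sense-and-display loop Mesh is designed to support, the sketch below polls inertial and compass data and maps device tilt onto vibrotactile intensity. The read_accelerometer, read_compass and set_vibration calls are hypothetical stand-ins, not Mesh's actual API.

```python
import math
import time

def tilt_angle(ax, ay, az):
    """Device tilt from vertical, in degrees, from accelerometer axes."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

def run_loop(read_accelerometer, read_compass, set_vibration,
             hz=50, steps=100):
    """Poll sensors at a fixed rate and drive the vibrotactile display."""
    for _ in range(steps):
        ax, ay, az = read_accelerometer()
        heading = read_compass()   # available to map-style applications
        # Display policy for a tilt-based interface: vibration intensity
        # grows with how far the device is tilted from vertical.
        set_vibration(min(1.0, tilt_angle(ax, ay, az) / 90.0))
        time.sleep(1.0 / hz)

# Demo with stubbed sensors: a device held at a constant 30-degree tilt.
run_loop(lambda: (0.5, 0.0, 0.866), lambda: 90.0,
         lambda v: None, hz=50, steps=5)
```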

Watch Video of Mesh.

Related Publications:

  • O'Modhrain, S., (2004). "Touch and go - designing haptic feedback for a hand-held mobile device" BT Technology Journal, Vol 22 No. 4, Kluwer Academic Publishers, October 2004 (download pdf)

  • Oakley, I. & O'Modhrain, S., (2005). "Tilt to Scroll: Evaluating a Motion Based Vibrotactile Mobile Interface" in proceedings of the IEEE World Haptics Conference (WHC), Pisa, Italy (download pdf)

  • Hughes, S., Oakley, I. & O'Modhrain, S., (2004). "MESH: Supporting Mobile Multi-modal Interfaces" in proceedings of ACM UIST'04, Santa Fe, NM (download pdf)

  • Oakley, I., Angesleva, J., Hughes, S. & O'Modhrain, S., (2004). "Tilt and Feel: Scrolling with Vibrotactile Display" in proceedings of EuroHaptics'04, Munich, Germany (download pdf)

People:

Stephen Hughes, Ian Oakley, Sile O'Modhrain

Body Mnemonics 


Body Mnemonics is a meta-tool for portable devices that enhances their usability, shifts interaction to the periphery of our attention, and makes devices more responsive to our cultural background. It rests on three principles: proprioceptive feedback, body image, and the "method of loci" mnemonic device.

Using inertial sensing, the movements of a portable device in 3D space can be tracked, analysed and referenced to the posture of the user. This enables a user to store and access information in his or her own body space. For example, online banking information could be accessed by moving the device to your back pocket. Similarly, your music archive could be located at your ear.
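
A minimal sketch of the lookup this enables is given below: the device's end position, expressed in a body-centred frame, is matched to the nearest stored body location and its associated shortcut. The coordinates, tolerance and SHORTCUTS table are illustrative assumptions, not the system's actual data.

```python
import math

# Body locations in a torso-centred frame (x right, y up, z forward),
# in metres. Illustrative values only.
LOCATIONS = {
    "ear":         (0.10, 0.55, 0.05),
    "back_pocket": (0.15, -0.40, -0.15),
    "heart":       (-0.08, 0.25, 0.10),
}

SHORTCUTS = {"ear": "music archive", "back_pocket": "online banking"}

def nearest_location(pos, max_dist=0.15):
    """Return the body location closest to pos, or None if none is near."""
    name, coords = min(LOCATIONS.items(),
                       key=lambda kv: math.dist(pos, kv[1]))
    return name if math.dist(pos, coords) <= max_dist else None

# A gesture that ends near the ear opens the music archive.
loc = nearest_location((0.08, 0.52, 0.04))
print(SHORTCUTS.get(loc, "no action"))
```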

More information is available on the Body Mnemonics website.

Watch Video of Body Mnemonics.

Related Publications:

  • Angesleva, J., Oakley, I., Hughes, S. & O'Modhrain, S., (2003). "Body Mnemonics: Portable device interaction design concept" in proceedings of UIST'03, Vancouver, Canada (download pdf)

  • Angesleva, J., O'Modhrain, S., Oakley, I., & Hughes, S., (2003). "Body Mnemonics" in proceedings of Physical Interaction (PI03) - Workshop on Real World User Interfaces, a workshop at the Mobile HCI Conference 2003, Udine, Italy (download pdf)

People: Jussi Angesleva, Stephen Hughes, Ian Oakley, Sile O'Modhrain


ContactIM 

Image: Picture of ContactIM System

Following on from work such as inTouch, ContactIM uses a touch-enabled Instant Messaging system to explore the use of haptics in interpersonal communication. As well as text, audio and video, users are able to send one another touch messages. These range from simply throwing a ball (to be caught and then returned) to more complex and personal communications such as handshakes and hugs. We hope to create a system that provides users with a rich and expressive haptic vocabulary.

One unique aspect of this project is that it explores touch communication that does not take place in tightly coupled, interactive scenarios. This means it does not require the extremely low-latency links typical of haptic communication systems, which makes it ideal for use over existing communication technologies such as the Internet.
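
The sketch below illustrates why asynchrony helps: a touch message such as a handshake is just a recorded force profile that can be serialized and delivered like any other IM payload, then replayed at the receiving end. The message format and the sample_fn/actuator_fn callbacks are illustrative assumptions, not ContactIM's actual protocol.

```python
import json
import time

def record_handshake(sample_fn, duration_s=0.1, hz=100):
    """Record a force profile (one float per sample) from a sensor."""
    samples = [sample_fn(i / hz) for i in range(int(duration_s * hz))]
    return {"type": "handshake", "rate_hz": hz, "samples": samples}

def play_message(msg, actuator_fn):
    """Replay a received touch message on a force-feedback device."""
    for s in msg["samples"]:
        actuator_fn(s)
        time.sleep(1.0 / msg["rate_hz"])

# Serialize like any IM payload; network latency only delays delivery,
# it does not distort the felt message.
wire = json.dumps(record_handshake(lambda t: 0.5 + 0.5 * (t % 0.5 < 0.25)))
play_message(json.loads(wire), actuator_fn=lambda force: None)
```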

Related Publications:

  • Oakley, I. & O'Modhrain, S., (2002). "Contact IM: Exploring Asynchronous Touch Over Distance" in proceedings of CSCW 2002, New Orleans, LA (download pdf)

People:

Ian Oakley, Sile O'Modhrain

Topographic Torch 

Image: Picture of Topographic Torch

Topographic Torch is a handheld digital mapping tool featuring a novel egocentric interaction model: people physically point at objects of interest in the world and automatically see those objects on a digital map. In this way the tool places people in an egocentric frame of reference with respect to the map and, we believe, may enhance their ability to understand the relationship between where they are in the world and where other objects of interest are in relation to them.
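
As a sketch of the selection step behind this interaction, the code below computes the compass bearing from the user to each mapped object and returns those lying along the direction the device is pointing. The flat-earth geometry, coordinates and tolerance are simplifying assumptions for illustration.

```python
import math

def bearing_to(user, target):
    """Compass bearing (degrees, 0 = north) from user to target (lat, lon)."""
    dlat = target[0] - user[0]
    dlon = (target[1] - user[1]) * math.cos(math.radians(user[0]))
    return math.degrees(math.atan2(dlon, dlat)) % 360

def pointed_at(user, heading, objects, tol_deg=10):
    """Objects whose bearing lies within tol_deg of the device heading."""
    return [name for name, pos in objects.items()
            if abs((bearing_to(user, pos) - heading + 180) % 360 - 180) <= tol_deg]

# Hypothetical user position and landmarks; heading from the compass.
objects = {"library": (53.3444, -6.2577), "spire": (53.3498, -6.2603)}
print(pointed_at(user=(53.3430, -6.2650), heading=25, objects=objects))
```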

Related Publications:

  • Bennett, M. & O'Modhrain, S., (2006). "Here Or There Is Where? Haptic Egocentric Interaction With Topographic Torch" To appear in proceedings of the workshop "What is the Next Generation of Human-Computer Interaction?" at CHI 2006, Montreal, Canada, April 2006

People:

Mike Bennett, Sile O'Modhrain

Gesture Passwords 

This project investigates the feasibility of a personal verification system using gestures as biometric signatures. Gestures are captured by low-power, low-cost tri-axial accelerometers integrated into an expansion pack for palmtop computers. The objective of our study is to understand whether a mobile system can recognize its owner by how he or she performs a particular gesture, which then acts as a gesture signature. The signature can be used to obtain access to the mobile device itself, but the handheld device can also act as an intelligent key to access services in the context of a wider communications network.
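
One plausible way to implement such matching, sketched below, is to compare a candidate accelerometer trace against enrolled templates using dynamic time warping (DTW) and accept when the normalised distance falls under a threshold. The distance measure and threshold are illustrative assumptions rather than the study's exact method.

```python
import math

def dtw(a, b):
    """DTW distance between two sequences of (x, y, z) samples."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m] / (n + m)   # normalise by combined length

def verify(candidate, templates, threshold=0.5):
    """Accept if the candidate is close enough to any enrolled template."""
    return min(dtw(candidate, t) for t in templates) <= threshold

# Demo: a slightly perturbed repetition of the enrolled gesture passes.
enrolled = [[(0, 0, 1), (0.2, 0, 1), (0.5, 0.1, 0.9), (0.2, 0, 1)]]
print(verify([(0, 0, 1), (0.25, 0, 1), (0.5, 0.1, 0.9), (0.2, 0, 1)], enrolled))
```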

Related Publications:

  • Farella, E., O'Modhrain, S., Benini, L. & Ricco, B., (2006). "Gesture Signatures for Ambient Intelligence Applications: a Feasibility Study" To appear in proceedings of Pervasive 2006, Springer LNCS, May 2006

People:

Elisabetta Farella, Sile O'Modhrain

Enactive Musical Instruments 

Image: Two Pictures of PebbleBox from TouchMe exhibition

Many interactions with physical objects are physically or perceptually similar. For example, the experience of shaking a container of ice cubes shares many perceptual qualities with that of shaking a container of pebbles or ball bearings. All are objects of a similar size and hardness, properties which give rise to similar auditory and haptic (inertial) percepts when they collide inside the container. Moreover, these shared physical properties also define the kinds of gestures that are possible: in the example above, one can imagine reaching into the container and shuffling the objects, removing some of them, or holding them in one's hand. In this sense we may say that the physics of an interaction defines its gesture space. In this work, which is carried out in the context of the European Union project Enactive, we seek to exploit our tacit knowledge of the behaviour of physical systems with well-understood auditory and haptic percepts (collision, friction, etc.) to design new musical instruments.
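
The sketch below illustrates this principle as it appears in an instrument like PebbleBox: collision onsets detected in a sensed signal trigger sound grains whose loudness follows the collision strength. The onset detector, threshold and play_grain stand-in are illustrative assumptions.

```python
def detect_onsets(signal, threshold=0.3):
    """Yield (index, amplitude) wherever the level jumps past threshold."""
    prev = 0.0
    for i, s in enumerate(signal):
        level = abs(s)
        if level - prev > threshold:      # a sharp rise reads as a collision
            yield i, level
        prev = level

def play_grain(amplitude):
    # Stand-in for a granular-synthesis voice: louder collisions
    # trigger louder grains.
    print(f"grain at amplitude {amplitude:.2f}")

# Demo with a short, hand-written "microphone" buffer.
mic = [0.0, 0.05, 0.9, 0.4, 0.1, 0.02, 0.6, 0.2, 0.0]
for i, amp in detect_onsets(mic):
    play_grain(amp)
```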

Watch Video of PebbleBox.

Related Publications:

  • Essl, G. & O'Modhrain, S., (In Press). "An Enactive Approach to the Design of New Tangible Musical Instruments."

  • Essl, G., Magnusson, C., Eriksson, J. & O'Modhrain, S., (In Press) "Performance, control, and preference in physical and virtual sensorimotor integration" To appear in Virtual Reality Journal, special issue on Enactive Interfaces.

  • Essl, G. & O'Modhrain, S., (2005). "Scrubber: An Interface for Friction-induced Sounds" in proceedings of NIME'05, Vancouver, Canada (download pdf)

  • O'Modhrain, S. & Essl, G., (2004). "PebbleBox and CrumbleBag: Tactile Interfaces for Granular Synthesis" in proceedings of NIME'04, Hamamatsu, Japan (download pdf)

People:

Georg Essl, Sile O'Modhrain

Touch TV 

Image: Picture of Touching Tales

Touch TV extends the notion of media content production and viewing to include a new sensory modality: the sense of touch. Recent trends in home theatre technologies point to an increasing desire by viewers to feel immersed in the content they are viewing. Touch TV provides a further opportunity to enhance this experience by simulating a physical link between the viewer and the content. By creating a touch-enabled TV remote control and an innovative series of children's cartoons, "Touching Tales", designed from the ground up to be felt as well as heard and viewed, we suggest that an interactive viewing experience could be greatly enriched by the ability to feel the motion of objects in a scene, and even to control the movement of those objects, increasing the realism of the interaction. Furthermore, in "Touch Football", we demonstrate methods for gathering content in real time via the equivalent of a camera for touch.
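
As a sketch of how such content might be authored and replayed, the code below stores a per-frame force vector alongside the video and applies it to a touch-enabled remote during playback. The track format and the apply_force call are illustrative assumptions, not the production pipeline used for "Touching Tales".

```python
import time

FPS = 25  # playback rate shared by the video and the haptic track

# Per-frame force vector (x, y) that an on-screen object exerts on the
# remote, authored alongside the animation.
haptic_track = {0: (0.0, 0.0), 1: (0.2, 0.0), 2: (0.5, 0.1), 3: (0.3, 0.4)}

def play(n_frames, apply_force):
    """Replay the haptic track in lockstep with video playback."""
    for frame_no in range(n_frames):
        fx, fy = haptic_track.get(frame_no, (0.0, 0.0))
        apply_force(fx, fy)        # felt while the frame is on screen
        time.sleep(1.0 / FPS)      # stand-in for rendering the frame

# Demo with a stubbed remote that just prints the commanded force.
play(4, apply_force=lambda fx, fy: print(f"force ({fx:.1f}, {fy:.1f})"))
```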

Watch Video of Touch TV.

Related Publications:

  • O'Modhrain, S. & Oakley, I., (2004). "Adding Interactivity: Active Touch in Broadcast Media" in proceedings of the 12th Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems, Chicago, Il (download pdf)

  • Oakley, I. & O'Modhrain, S., (2003). "Cross-modal perception of motion-based visual-haptic stimuli" presented at the International Multisensory Research Forum 2003 (IMRF 2003), Hamilton, Canada (download presentation ppt)

  • O'Modhrain, S. & Oakley, I., (2003). "Touch TV: Adding Feeling to Broadcast Media" in proceedings of the European Conference on Interactive Television: from Viewers to Actors?, Brighton, UK. Awarded best paper in conference (download pdf)

People:

Ian Oakley, Andy Brady, Cormac Cannon, Stephen Davies, Catherine Little, Andrea Chew, Sile O'Modhrain

Relay 

Image: Picture of Remote Control Car

Relay explores the concept of conveying the movement of a distant vehicle via haptic feedback to a device held in the observer's hand. The project was developed to explore two scenarios: providing feedback about the motion of a car in a broadcast of a Formula 1 race, and providing additional cues in a vehicle teleoperation setting.

Users engaged in telemanipulation, the remote control of mechanical devices, typically rely on visually presented information to perform their tasks. Often they observe the actions of the device they are controlling in person, or use device-mounted cameras to provide a view of its activities. However, as the majority of telemanipulation tasks are of a physical nature, vision is arguably an unsuitable modality for the sole presentation of information in this domain. We suggest that haptics, or the sense of touch, may add significantly to the quality of a user's interaction by allowing them to experience more directly the physical environment encountered by the device.

We use a radio-controlled toy car to demonstrate this concept. Radio-controlled vehicles are traditionally driven with a controller equipped with spring-loaded levers for forward/reverse and left/right. The operator relies on observing the vehicle in order to make the decisions needed to steer it; the controlling interface offers no tactile information about the forces the vehicle is 'experiencing'. We aim to enhance the interactivity and improve control of the vehicle by developing a handset controller that enables the user to actually feel the forces the vehicle experiences as it accelerates, brakes and turns.

To achieve this we have mounted accelerometers in the vehicle to measure the fluctuations in tilt, roll and yaw that result from driving. This information about the terrain the car is travelling over is relayed to the handset via a radio link and translated into displacement of an actuated plate in the handset.
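
A minimal sketch of that relay path is shown below: the vehicle packs its angle readings into a radio message, and the handset maps them onto plate travel. The message format, gains and travel limits are illustrative assumptions about a system like Relay, not its actual firmware.

```python
import struct

def pack_motion(tilt, roll, yaw):
    """Vehicle side: encode three angle readings (degrees) for the radio link."""
    return struct.pack("<fff", tilt, roll, yaw)

def plate_displacement(msg, gain_mm_per_deg=0.1, limit_mm=3.0):
    """Handset side: map received motion onto actuated-plate travel."""
    tilt, roll, yaw = struct.unpack("<fff", msg)
    # Roll drives sideways travel, tilt drives fore-aft travel; yaw could
    # drive a rotational cue on a richer display.
    x = max(-limit_mm, min(limit_mm, roll * gain_mm_per_deg))
    y = max(-limit_mm, min(limit_mm, tilt * gain_mm_per_deg))
    return x, y   # millimetres of plate travel felt by the fingertips

# A hard left turn on rough ground shifts the plate under the hand.
print(plate_displacement(pack_motion(tilt=5.0, roll=-20.0, yaw=12.0)))
```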

Handheld Haptic Display for Relay

This project addresses the issues of ungrounded haptic display. Force-feedback devices are typically mounted on a solid object, which grounds the forces they exert and ensures the device can output clearly defined haptic feedback. In the case of handheld devices, the individual holding the device (and experiencing the forces) is also grounding it, which can make the forces presented significantly harder to perceive. This project looks at novel technologies for ungrounded haptic display. In its latest incarnation, a pair of movable plates is arranged in parallel: the upper plate is grounded by the heel of the hand and the lower (moving) plate is felt by the fingertips. The moving plate can be actuated to provide force displacement, giving a compelling illusion of force feedback in a handheld, ungrounded device.

Related Publications:

  • Hughes, S., Oakley, I., Brady, A. & O'Modhrain, S., (2003). "Exploring Dynamic Haptic Cues in Vehicle Teleoperation" in proceedings of EuroHaptics 2003, Dublin, Ireland (download pdf)

  • Brady, A., MacDonald, B., Oakley, I., Hughes, S., O'Modhrain, S., (2002). "RELAY: A Futuristic Interface for Remote Driving" in proceedings of EuroHaptics 2002, Edinburgh, UK (download pdf)

People:

Andy Brady, Ian Oakley, Stephen Hughes, Sile O'Modhrain

EpipE 

Image: Picture of EpipE

A musician's performance is intimately bound up with their physical means of expression. The instrument constrains, but its unique character colours and informs, the performance. Until now, much of the focus in musical interface design has been on the keyboard. The wind and string communities have been particularly poorly served: the more 'organic' nature of their interaction with their instruments makes it much harder to measure meaningfully.

The goal of this project is to develop a palpable interface based on the Irish Uilleann Pipes, a polyphonic reeded woodwind with a complex (some might say bizarre) interface. The EpipE interface will use a combination of capacitive and optical sensing technologies to provide continuous tonehole control, an element missing from existing electronic woodwinds and essential for expressive performance. It will output a set of parameters that can drive physically based instrument models in a performance context and, ultimately, generate a MIDI control stream, allowing it to be used with the huge base of existing music software.
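
To illustrate that final output stage, the sketch below maps continuous tonehole coverage values onto a MIDI Control Change stream, one controller per hole. The controller numbers and scaling are arbitrary illustrative choices, not EpipE's actual assignments.

```python
def tonehole_cc_messages(coverages, channel=0, first_cc=20):
    """Turn per-hole coverage fractions (0 = open, 1 = sealed) into
    raw MIDI Control Change messages, one controller per tonehole."""
    messages = []
    for hole, cover in enumerate(coverages):
        value = max(0, min(127, round(cover * 127)))
        status = 0xB0 | channel           # Control Change on this channel
        messages.append(bytes([status, first_cc + hole, value]))
    return messages

# Eight toneholes: sealed, half-shaded and open holes all appear as
# continuous values rather than binary on/off switches.
for msg in tonehole_cc_messages([1.0, 1.0, 0.5, 0.2, 0.0, 0.0, 0.0, 0.0]):
    print(msg.hex(" "))
```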

Related Publications:

  • Hughes, S., Cannon, C. & O'Modhrain, S., (2004). "EpipE: A Novel Electronic Woodwind Controller" in proceedings of NIME'04, Hamamatsu, Japan (download pdf)

  • Cannon, C., Hughes, S. & O'Modhrain, S., (2003). "EpipE: Exploration of the Uilleann Pipes as a potential controller for computer-based music" in proceedings of New Interfaces for Musical Expression (NIME'03), Montreal, Canada (download pdf)

People:

Cormac Cannon, Stephen Hughes, Sile O'Modhrain

Handling Perspective: Cross-modal disparity between real and virtual representations of objects. 

Image: Picture of handling perspective

As the boundaries between real and virtual environments become blurred and we begin to use real objects as handles for objects in virtual scenes, interesting questions relating to the relationship between real objects and their virtual counterparts arise.

In this project we address one such question: what is the effect of cross-modal disparity in "field of view" on our ability to link the behaviour of an object we are holding in the real world with its representation in a virtual environment? To address this question, we designed a series of tasks in which we manipulated the relationship between haptic properties (orientation, number of parts, etc.) of the real-world object and visual properties of its virtual counterpart.

Related Publications:

  • Woods, A., O'Modhrain, S. & Newell, F., (2004). "The effect of temporal delay and spatial differences on cross-modal object recognition." Cognitive, Affective and Behavioral Neuroscience, Vol. 4, No. 1, pp. 260-269, Psychonomic Society Publications, June 2004

People:

Andy Woods, Fiona Newell (Trinity College Dublin), Sile O'Modhrain
Sile O'Modhrain, Sonic Arts Research Centre, Queen's University Belfast