SUMMARY

This is a list of the most significant projects I have worked on in the past few years. For more details, please refer to my Google Scholar page.

PORTFOLIO

FUTUREPLAY CREATIVE LAB PROJECTS

LISTEN TO YOUR FOOTSTEPS

Listen to Your Footsteps is a wearable device for measuring walking quality.


HARMONIOUS HAPTICS

Smartwatches now allow information to be conveniently accessed directly from the user’s wrist. However, the smartwatches currently on the market offer a limited number of applications. In this paper, we propose a new interaction technique named Harmonious Haptics, which provides users with enhanced tactile sensations by utilizing smartwatches as additional tactile displays for smartphones. When combined with typical mobile devices, our technique enables the design of a wide variety of tactile stimuli. To illustrate the potential of our approach, we developed a set of example applications that provide users with rich tactile feedback, such as feeling textures in a graphical user interface, transferring a file between a tablet and a smartwatch, and controlling UI components.


TOUCH+

We present a new interaction technique, called Touch+, which expands the touch input vocabulary by using both mobile devices and wrist-worn devices. This technique enables the device to recognize twisting, rolling, and lifting of a finger while it is touching the screen. Moreover, our system differentiates between a fast touch (release) and a normal touch (release). This is achieved by calculating the relative differences in movement speed and angle between the smartphone and the smartwatch. To illustrate the potential of our approach, we developed a set of possible applications. We believe that Touch+ opens a large design space for mobile input interactions that combine wrist-worn devices.
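The core of this idea, comparing wrist-side motion measured at the moment of contact, can be sketched as follows. This is a minimal illustration with hypothetical function names and thresholds, not the actual Touch+ implementation:

```python
# Hedged sketch of the Touch+ classification idea: motion captured by the
# smartwatch IMU around a smartphone touch event is used to classify how
# the finger touched. Thresholds are illustrative placeholders.

def classify_touch(watch_speed, watch_angle_delta,
                   fast_speed_threshold=1.5, twist_angle_threshold=20.0):
    """Classify a touch from wrist movement speed (m/s) and wrist
    rotation (degrees) measured around the moment of contact."""
    if abs(watch_angle_delta) > twist_angle_threshold:
        # Large wrist rotation while touching -> twisting/rolling gesture.
        return "twist"
    if watch_speed > fast_speed_threshold:
        # The wrist approached the screen quickly -> fast touch.
        return "fast_touch"
    return "normal_touch"
```

For example, a touch arriving at high wrist speed with little rotation would be labeled a fast touch, while a slow touch accompanied by large wrist rotation would be labeled a twist.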


CONTEXTUAL DRAG

We present a novel dragging interaction technique, called Contextual Drag, which dynamically changes the friction of the dragging movement according to the context (e.g., the type of content presented, the density of points of interest on a map, or the frequency of use of items in a list). We also suggest a number of dragging effects such as snapping, shaking (changing the trajectory of motion), and zooming. To explore the potential of this idea, we implemented a prototype and suggest a series of possible applications. We believe that this simple and novel technique enables users to navigate digital content more effectively than with constant-friction motion.
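The friction idea can be sketched in a few lines: the raw drag delta is divided by a context-dependent friction coefficient, so dense or important regions feel "stickier". The function name and friction values below are illustrative assumptions, not the paper's algorithm:

```python
# Hedged sketch of Contextual Drag: scale a raw drag delta by a
# context-dependent friction coefficient. Values are illustrative.

def apply_drag(raw_delta, poi_density, base_friction=1.0, max_friction=4.0):
    """Higher point-of-interest density -> more friction -> smaller motion.
    poi_density is assumed normalized to 0..1."""
    friction = base_friction + (max_friction - base_friction) * min(poi_density, 1.0)
    return raw_delta / friction
```

In an empty region the drag passes through unchanged, while in a dense region the same finger movement produces a quarter of the motion, slowing the user down where content is interesting.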


WYNC

We introduce a wrist-worn-sensor-based synchronization technique, called Wync, which enables various input devices to recognize who is interacting with them. This is achieved by comparing the physical movement data from the wrist-worn device with the data from other sensors (e.g., a touchscreen or a camera) while the user performs physical interactions (e.g., touching, dragging, pinching, and hovering in the air). To verify our idea, we present a series of concept scenarios using Wync: security-enhanced password input, slide to login, and copy and paste. We believe that Wync will open a large design space for synchronization with wrist-worn devices.




PH.D. PROJECTS

PAPYLOOKS

PapyLooks is an eBook system for mobile devices that uses the reader's gaze to enhance the reading experience. The system provides users with visual, auditory, and tactile feedback associated with the text currently being read. We explored the effects of the 18 different types of feedback we designed. We also suggest several eye-based interactions for eBook applications. Sponsored by HRHRP KAIST.


MAGCUBES

MagCubes are tangible widgets for children that work both on and around mobile devices. The advantage of this technique is that it is simple, battery-free, and inexpensive because it relies solely on a magnetometer, which is already installed in modern smart devices. To motivate our approach, we suggest various applications built with the MagGetz toolkit. The first application is a board game with a magnetic die, with which users can input numbers by placing the die in a specific area after throwing it. The second application is a simple drawing application with a color-picker cube. The third is a math-learning game with cubes of five different sizes (the strength of the magnetic force is proportional to the size of the cube).


LIQUIDESK

We introduce a liquid-based interactive tabletop system, LiquiDesk, a tabletop aquarium with an array of water pumps installed. The water pump array provides rich tactile feedback to the users, and also interesting visual feedback by deforming the water surface. As a prototype application, we introduce the game "Digital Doctor Fish". Doctor fish, Garra rufa, live in the pools of some Turkish river systems and hot springs and have been used as a spa treatment; now we bring them into the digital domain. When you touch the water surface of LiquiDesk, virtual doctor fish gather at the touch point and make you feel as though the fish were nibbling at your skin. You can also catch a fish (if you can), push the fish into a corner, or even attack them.


NAILSENSE

In this paper, we propose a new interaction technique, called NailSense, which allows users to control a mobile device by hovering and slightly bending/extending fingers behind the device. NailSense provides basic interactions equivalent to those of a touchscreen: 2-D locations and binary states (i.e., touched or released) are tracked and used for input, but without any need to touch the screen. The proposed technique tracks the user's fingertip in real time and triggers events based on color changes in the fingernail area. It works with conventional smartphone cameras, which means no additional hardware is needed. This novel technique allows users to operate mobile devices without occlusion, a crucial problem with touchscreens, while also promising an extended interaction space in the air, on a desktop, or anywhere else. The technique is tested with example applications: a drawing app and a web browser. Sponsored by FuturePlay Inc.


MAGGETZ

This paper proposes user-customizable passive control widgets, called MagGetz, which enable tangible interaction on and around mobile devices without requiring power or wireless connections. This is achieved by tracking and analyzing the magnetic field generated by controllers attached on and around the device through a single magnetometer, which is commonly integrated in smartphones today. The proposed method provides users with a broader interaction area, customizable input layouts, richer physical clues, and higher input expressiveness without the need for hardware modifications. We present a software toolkit and several applications using MagGetz.
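The sensing principle can be sketched as nearest-neighbor matching against calibrated magnetic fingerprints: each widget position leaves a characteristic field at the magnetometer, and the current reading is matched to the closest stored state. The state names and calibration values below are illustrative assumptions, not from the paper:

```python
import math

# Hedged sketch of magnetometer-based widget tracking: match the current
# 3-axis reading (microtesla) to the nearest pre-calibrated fingerprint.
# Fingerprint values below are made up for illustration.

CALIBRATED_STATES = {
    "button_released": (12.0, -3.0, 40.0),
    "button_pressed":  (55.0, -8.0, 95.0),
    "slider_left":     (-20.0, 30.0, 10.0),
    "slider_right":    (25.0, 33.0, 12.0),
}

def classify_reading(reading):
    """Return the widget state whose fingerprint is closest to `reading`."""
    return min(CALIBRATED_STATES,
               key=lambda s: math.dist(CALIBRATED_STATES[s], reading))
```

A reading near the "pressed" fingerprint resolves to that state even with sensor noise, which is what makes a single magnetometer sufficient for several passive widgets.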


VIBPRESS

This paper introduces VibPress, a software technique that enables pressure input interaction on mobile devices by measuring, with the built-in accelerometer, the level of vibration absorption when the device is in contact with a damping surface (e.g., the user’s hands). This is achieved using a real-time estimation algorithm running on the device. Through a user evaluation, we provide evidence that this system is faster than previous software-based approaches and as accurate as hardware-augmented approaches (up to 99.7% accuracy). With this work, we also provide insight into the maximum number of pressure levels that users can reliably distinguish, reporting usability metrics (time, errors, and cognitive load) for different pressure levels and for each type of gripping gesture (press and squeeze).
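The absorption idea can be sketched as follows: while the vibration motor runs, a firmer grip damps the vibration, so a lower accelerometer amplitude maps to a higher pressure level. This is a minimal illustration under assumed units and thresholds, not the paper's estimation algorithm:

```python
# Hedged sketch of vibration-absorption pressure sensing: compare the
# measured vibration amplitude against the undamped baseline and quantize
# the absorption ratio into discrete pressure levels.

def vibration_amplitude(samples):
    """Peak-to-peak amplitude of raw accelerometer samples."""
    return max(samples) - min(samples)

def pressure_level(samples, free_amplitude, levels=3):
    """Map the absorption ratio (0 = no damping, 1 = fully damped) to a
    discrete pressure level in 0..levels-1. `free_amplitude` is the
    amplitude measured with the device resting freely."""
    absorbed = 1.0 - vibration_amplitude(samples) / free_amplitude
    absorbed = min(max(absorbed, 0.0), 1.0)
    return min(int(absorbed * levels), levels - 1)
```

An undamped window returns level 0, while a window whose amplitude has dropped to a quarter of the baseline returns the top level.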


MAGPEN

This paper introduces MagPen, a magnetically driven pen interface that works both on and around mobile devices. The proposed device introduces a new vocabulary of gestures and techniques that increase the expressiveness of the standard capacitive stylus. These techniques are: 1) detecting the orientation that the stylus is pointing to, 2) selecting colors using locations beyond the screen boundaries, 3) recognizing different spinning gestures associated with different actions, 4) inferring the pressure applied to the pen, and 5) uniquely identifying different pens associated with different operational modes. These techniques are achieved using commonly available smartphones that sense and analyze the magnetic field produced by a permanent magnet embedded in a standard capacitive stylus. This paper explores how magnets can be used to expand the design space of current pen interaction, and proposes a new technology to achieve such results.


MAGNETIC MARIONETTE

In this paper, we present the Magnetic Marionette, a magnetically driven elastic controller that enables tangible interaction on mobile devices. This technique can recognize eight different gestures with over 99% accuracy by sensing and tracking the magnets embedded in the controller. The advantage of this technique is that it is lightweight, battery-free, and inexpensive because it uses a magnetometer, which is already embedded in today's smartphones. This simple and novel technique allows users to receive richer tactile feedback, expand their interaction area, and enhance expressiveness without the need for hardware modification.


VIBROTACTOR

In this paper, we present a low-cost placement-aware technique, called VibroTactor, which allows mobile devices to determine where they are placed (e.g., in a pocket, on a phone holder, on the bed, or on the desk). This is achieved by filtering and analyzing the acoustic signal generated when the mobile device vibrates. The advantage of this technique is that it is inexpensive and easy to deploy because it uses a microphone, which is already embedded in standard mobile devices. To verify this idea, we implemented a prototype and conducted a preliminary test. The results show that this technique can detect 12 different real-world placements with 91% accuracy.
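The pipeline can be sketched as: vibrate, record the echo with the microphone, extract a couple of simple features, and match the feature vector to the nearest placement template learned beforehand. The feature choices and template values below are assumptions for illustration only:

```python
import math

# Hedged sketch of acoustic placement classification: RMS energy and
# zero-crossing rate of the recorded echo are matched to the nearest
# pre-recorded placement template. Template values are made up.

def features(signal):
    rms = math.sqrt(sum(s * s for s in signal) / len(signal))
    zero_crossings = sum(
        1 for a, b in zip(signal, signal[1:]) if (a < 0) != (b < 0)
    )
    return (rms, zero_crossings / len(signal))

PLACEMENT_TEMPLATES = {
    "on_desk":   (0.8, 0.40),   # hard surface: loud, bright echo
    "on_bed":    (0.1, 0.05),   # soft surface: heavily damped
    "in_pocket": (0.3, 0.15),
}

def classify_placement(signal):
    f = features(signal)
    return min(PLACEMENT_TEMPLATES,
               key=lambda p: math.dist(PLACEMENT_TEMPLATES[p], f))
```

A loud, high-frequency echo resolves to a hard surface, while a quiet, damped one resolves to a soft surface such as a bed.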

Patented (KR) > Published (IUI2013)


PSEUDO BUTTON

We propose a new interaction technique, called PseudoButton, which emulates a pressure-sensitive touch sensor by repurposing the built-in microphone on mobile devices. This simple and novel technique increases the input expressivity of the device and expands its interaction area, alleviating the occlusion problem caused by touchscreens without adding extra sensors. To verify our idea, we implemented a prototype and conducted a preliminary evaluation. The results show that participants could input five different pressure levels with 94% accuracy and minimal error.


MICPEN

This paper introduces MicPen, a low-cost pressure-sensitive stylus pen interface for standard touchscreen displays that uses a microphone to estimate the amount of pressure applied to the pen. This is achieved by filtering and analyzing the acoustic signal generated when the tip of the pen is rubbed on the touchscreen. The advantage of this approach is that it is inexpensive, reliable, and suitable for mobile interaction because it does not require mechanical parts to sense the input pressure. Results from a user study show that participants recognized five out of ten different pressure levels with perfect accuracy, and nine out of ten with minimal error.

TOUCHCUP

Physical interface elements provide intuitive tactile clues so that users can perform low-attention, vision-free interactions. However, many conventional devices today have flat touchscreens that lack tactile feedback. In this paper, we present a new interaction technique, called TouchCup, which provides beneficial tactile sensations while retaining the flexibility and expressiveness of the touchscreen. We implemented three different types of suction-cup-shaped buttons and conducted a preliminary experiment comparing a virtual button on a standard touchscreen with a touchscreen augmented with TouchCups.


EXMAR

There have been many studies aiming to minimize the increase in psychological and physical load caused by mobile augmented reality systems. In this paper, we propose a new technique called "EXMAR", which enables the user to explore his/her surroundings with an expanded field of view, resulting in a decrease in physical movement. Through this novel interaction technique, the user can explore off-screen points of interest with environmental contextual information through simple dragging gestures.




MASTERS PROJECTS

FORCETOUCH

The ForceTouch technique increases the input vocabulary of the touchscreen by utilizing the force property of touch interaction. This is achieved by a simple algorithm that combines events from the touchscreen with movement data from a built-in accelerometer. The proposed technique allows a touchscreen to distinguish touch inputs of different force. This technique was invented and filed earlier than other techniques that infer input force from accelerometer-measured velocity.
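The combination of the two signals can be sketched in a few lines: a touch-down event is paired with the accelerometer window recorded at the same moment, and the size of the impact spike classifies the touch. The function name and thresholds are illustrative assumptions:

```python
# Hedged sketch of accelerometer-based touch-force classification: a
# forceful tap produces a short spike in accelerometer magnitude at the
# touch-down timestamp. Threshold is a placeholder.

def touch_force(accel_window, hard_threshold=2.0):
    """accel_window: accelerometer magnitudes (in g) sampled around the
    touch-down timestamp reported by the touchscreen."""
    spike = max(accel_window) - 1.0   # subtract the ~1 g gravity baseline
    return "hard" if spike > hard_threshold else "soft"
```

A window containing a large spike above the gravity baseline is labeled a hard touch; a flat window is labeled soft.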


KOREANSHAPER

In this paper, we present a Korean text-entry system that takes advantage of the multi-touch capabilities of modern touchscreen devices. The proposed interaction is inspired by the shape and structure of the Korean script, so users can understand it intuitively. To minimize the number of buttons, the system provides buttons only for the consonants of the Korean alphabet. To address low input resolution and finger occlusion, the center point between two touching fingers selects the consonant, and dragging guidelines improve input accuracy.


DISTORTABLE BAR

In this paper, we present a new interaction technique, called DistortableBar, which enables intuitive and effective manipulation of multimedia content by using dynamic shape and scale distortion. This unique stream visualization and control method is smoothly integrated into a single timeline interface, letting users alleviate the fat-finger problem while retaining the stream’s entire context. We also introduce possible applications using DistortableBar and their benefits.


EARFACE

We present a new interface, called Earface, which looks like a typical earphone but additionally lets users interact through teeth gestures. This is achieved by filtering and analyzing the bone-conducted acoustic signals of teeth gestures such as localized or rhythmic snapping or gnashing of the teeth. This technique frees the user's hands, allows input without visible motion (avoiding shoulder surfing), and is socially acceptable. We designed six different snapping gestures to control a music player and conducted a preliminary test. We believe that this technique could be applied to various types of wearable devices (e.g., Google Glass or headsets).


EMPRESS

We designed the Pressure Sensitive Keyboard as an interface not only for typing words but also for expressing the user's emotions. The proposed interface provides several visual feedback animations: slapping, punching, kissing, and sleeping. Each animation is presented on a specific area of the screen in response to input gestures: hitting, stroking, and even sleeping on the keyboard. The hardware, the Pressure Sensitive Keyboard, is supported by Microsoft Research. Implemented using Processing and the Pressure Sensitive Keyboard from Microsoft Research.


HAPTIC WHEEL

We propose a vibro-tactile display on a steering wheel, named the "Haptic Wheel", which embeds 32 linear actuators to present tactile information such as "Alert", "Turn left", and "Turn right". This information is coded by spatial and temporal patterns, so it is transferred regardless of the holding gesture and the number of hands gripping the steering wheel. This novel interface delivers information through physical contact with the driver’s palm, a part of the body that is free from objects such as clothes or shoes. We conducted an experiment to evaluate three proposed stimulus modes, comparing two-handed and one-handed use. This study gives a meaningful indication that a tactile display on the steering wheel can convey directional information to the driver.


VIRTUAL THUMB

We present VirtualThumb, a set of software-based conceptual techniques that enable a user to handle multi-touch operations with the limited physical resources available. This simple and novel technique allows users to comfortably perform multi-touch pinching gestures such as zooming, scaling, and rotating using only one finger. Hence, occlusion caused by two fingers is halved, gestures are not restricted by the physical boundaries of the screen, and the physical limitations of the user's fingers are reduced.


BIOLIN

We describe a media artwork that features the Biolin, a musical device that produces different sounds depending on the target object it is being played on. Shaped like an ordinary violin bow, the Biolin analyzes the target using a weak electric current to produce a timbre that matches the target. The user can then perform by "playing" the target object with the Biolin. The Biolin was built by modifying a violin bow to make it conductive and connecting it to a computer for data transmission, resulting in visual and auditory output. We showcased the Biolin in front of a small audience, which responded positively to this new approach of making sounds from everyday objects and the people around us.


MY GREEN PET

We propose a new method for interacting with both organisms and objects, using the change in current when a user conducts a gesture on the target of interaction, ultimately widening the spectrum of the object’s reaction based on the user’s behavior. For our installation, we used a plant as the main target of interaction. Implemented using Arduino and Processing.




COOPERATIVE PROJECTS

BANDSENSE

In this paper, we propose a new interaction technique, called BandSense, which allows pressure-sensitive multi-touch interaction on a wristband. The proposed method provides users with a broader interaction area and higher input expressiveness, enabling precise interaction with less occlusion. To illustrate the potential of our approach, we present a series of example applications with several input vocabularies. We also describe the overall architecture of our system. We believe that our technique can help users control a smartwatch easily and conveniently.

Hosted by Impressivo Inc.

SNAP

Prototyping is an essential part of the initial stages of the software development process, especially for design exploration. However, most designers find it difficult to prototype dynamic interactions, as opposed to designing static UIs, since they do not have programming experience. Moreover, it is even harder to communicate about interactions with developers. To address this issue, we present a novel prototyping tool, called SNAP, which enables designers to implement sensor-based interactions. Our concept draws on natural-language programming and the microinteraction model, so it does not require any programming skills. We believe that our approach will shed light on prototyping in the design process.

Hosted by Studio XID Inc.

TALKEY

The major use of a smartwatch is to check notifications from the connected smartphone at a glance, without taking the phone out of the pocket. However, a user still needs the smartphone for further actions such as writing replies to incoming messages. In this paper, we present a novel approach that suggests relevant reply messages by utilizing conversation data from SNSs. With Talkey, the user can reply to incoming messages simply by touching one of the suggested responses from our system, with the smartphone still in the pocket. This paper describes the prototype of Talkey, how it collects and analyzes conversation data, and what will be done in the next step.

Hosted by Fluenty Inc.

CHILI

Chili is a mobile video call system with viewpoint control of the remote scene and augmentation of the live video with freehand drawing. To enable the viewpoint control with ordinary mobile devices, Chili uses visualization and human action instead of any additional mechanics or optics, and also exploits remotely switching between the front and rear cameras. The drawings are world-stabilized by vision-gyroscope sensor fusion, and the users can draw even larger than the field-of-view by moving the device itself.

Hosted by Hyungeun Jo

POKE

Poke is a way of sharing emotional touches over phone calls. It delivers touches through an inflatable surface on one side of the phone and receives finger-pressure inputs on the opposite side of the phone, all while allowing callers to maintain a conventional phone-calling posture. Poke provides three key user interactions (poke, poke and vibrate, and poke back) and delivers affective touches through its inflating patterns and vibrations on the top of the inflatable surface. This opens possibilities for developing pleasant, affective tactile languages over phone calls.

Hosted by Young-woo Park

AROUND PLOT

Aroundplot is an overview interface for off-screen objects in 3D environments. The technique consists of two parts. The first is a mapping method from 3D spherical coordinates to a 2D orthogonal fisheye, which tackles the problems of existing 3D location-cue displays, such as occlusion among the cues and discordance with the human frame of reference. The second is a dynamic magnification method that magnifies the context in the direction the view is moving, to alleviate the distortion of the orthogonal fisheye and thus support precise movement. In particular, the magnification method can be generalized to applications in dimensions other than 3D, such as document scrolling or map panning.
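The first part, the mapping onto the screen edge, can be sketched for one axis: angles inside the field of view map linearly onto the screen, while off-screen angles are compressed into a fixed fisheye margin. The parameter names and the compression curve below are illustrative, not the published formulation:

```python
import math

# Hedged one-axis sketch of an orthogonal-fisheye mapping: linear inside
# the field of view, logarithmically compressed into a fixed margin
# outside it. All parameters are illustrative placeholders.

def map_angle(angle, half_fov=30.0, screen_half=160.0, margin=20.0):
    """Map a horizontal angle (degrees from view center) to screen x."""
    if abs(angle) <= half_fov:
        # Inside the field of view: linear mapping onto the screen.
        return angle / half_fov * (screen_half - margin)
    # Off-screen: compress the remaining 0..1 overflow into the margin.
    overflow = (abs(angle) - half_fov) / (180.0 - half_fov)
    compressed = margin * math.log1p(9.0 * overflow) / math.log(10.0)
    return math.copysign(screen_half - margin + compressed, angle)
```

The whole off-screen range thus stays visible at the screen edge, which is the property that lets cues for objects behind the user remain on screen without occluding the main view.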

Hosted by Hyungeun Jo



PUBLICATIONS

GRADUATION THESIS

S. Hwang, "PseudoSensor: Emulation of Input Modality by Repurposing Sensors on Mobile Devices", Doctoral thesis, Graduate School of Culture Technology, KAIST, 2015

S. Hwang, "Design of Vibro-tactile based Information Presentation for Vehicle Navigation", Master's thesis, Graduate School of Culture Technology, KAIST, 2010

S. Hwang, "Study of P-Grid System for Generating Realistic Representation of 3D Avatar", Bachelor's thesis, Department of Computer Science, Kwangwoon Univ, 2007

YEAR 2015

S. Hwang, J. Gim, "Listen to Your Footsteps: Wearable Device for Measuring Walking Quality", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI EA), 2015

T. Kim, S. Hwang, J. Gim, "SNAP: Sensor Aid Prototyping Tool for Designers", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI EA), 2015

S. Hwang, J. Song, J. Gim, "Harmonious Haptics: Enhanced Tactile Feedback Using a Mobile and a Wearable Device", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI Interactivity), 2015

Y. Ahn, S. Hwang, H. Yoon, J.H. Ryu, "BandSense: Pressure-sensitive Multi-touch Interaction on a Wristband", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI Interactivity), 2015

S. Hwang, J. Song, J. Gim, "TOUCH+: Expanding Touch Input Vocabulary using a Smartphone and a Smartwatch", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI VideoShowcase), 2015

S. Hwang, J. Gim, J. Yoo, A. Bianchi, "Contextual Drag: Context-based Dynamic Friction for Dragging Interaction", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI VideoShowcase), 2015

S. Hwang, J. Song, J. Gim, "MagCubes: Magnetically Driven Tangible Widgets for Children", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI VideoShowcase), 2015

S. Hwang, K.W. Wohn, "PseudoSensor: Emulation of Input Modality by Repurposing Sensors on Mobile Devices", Journal of Intelligence and Smart Environments (SCIE), 2015

S. Hwang, K.W. Wohn, "Designing Magnetically Driven Interfaces on Conventional Mobile Devices", International Journal of Human Computer Studies (SCI), 2015 - (On Revision)

S. Hwang, J. H. Ryu, "Vibro-tactile based Information Presentation on a Steering Wheel for Navigation", Journal of Intelligence and Smart Environments (SCIE), 2015 - (On Submission)

YEAR 2013

S. Hwang, D. Kim, S.W. Leigh, K.W. Wohn, "NailSense: Fingertip force as a new input modality", ACM Symposium on User Interface Software and Technology (UIST Poster), St Andrews, UK, 2013

S. Hwang, M. Ahn, K.W. Wohn, "MagGetz: User Configurable Tangible Controllers On and Around Mobile Devices", ACM Symposium on User Interface Software and Technology (UIST), St Andrews, UK, 2013

S. Hwang, A. Bianchi, K.W. Wohn, "VibPress: Enabling Pressure-Sensitive Interaction using Vibration Absorption on Mobile Device", International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), Munich, Germany, 2013

S. Hwang, A. Bianchi, M. Ahn, K.W. Wohn, "MagPen: Magnetically Driven Pen Interactions On and Around Mobile Device", International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), Munich, Germany, 2013

H. Jo, S. Hwang, "Chili: View Control and Gestural Communication for Mobile Video Calls", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI EA), Paris, France, 2013

S. Hwang, K.W. Wohn, "VibroTactor: Low-cost Placement-Aware Technique using Vibration Echoes on Mobile Devices", ACM International Conference on Intelligent User Interfaces (IUI Poster), Santa Monica, CA, 2013

S. Hwang, M. Ahn, K.W. Wohn, "Magnetic Marionette: Magnetically Driven Elastic Controller on Mobile Device", ACM International Conference on Intelligent User Interfaces (IUI Poster), Santa Monica, CA, 2013

J.T. Kim, S. Hwang, K.W. Wohn, "A Study on the Possibility of User-Context based Deformable Mobile Display", HCI Society of Korea (HCIKOREA), Korea, 2013

S. Hwang, M. Ahn, K.W. Wohn, "Method for Recognizing Tangible Controllers using magnetometer on Mobile Devices", HCI Society of Korea (HCIKOREA), Korea, 2013

YEAR 2012

S. Hwang, S. Kim, K.W. Wohn, "KoreanShaper: One-handed Multi-finger Gesture based Korean Inputting System", 10th Asia Pacific Conference on Computer Human Interaction (APCHI Poster), Matsue, Japan, 2012

Y.W. Park, S. Hwang, T.J. Nam, "Poke: Emotional Touch Delivery", Demo Hour, ACM Interactions Magazine, Volume 19 Issue 3, May + June 2012

S. Hwang, A. Bianchi, K.W. Wohn, "MicPen: Pressure-Sensitive Pen Interaction Using Microphone with Standard Touchscreen", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI EA), Austin, Texas, 2012

S. Hwang, K.W. Wohn, "Pseudo Button: Enabling Pressure-Sensitive Interaction by Repurposing Microphone on Mobile Device", The ACM SIGCHI Conference on Human Factors in Computing Systems (CHI EA), Austin, Texas, 2012

YEAR 2011

H. Jo, S. Hwang, H. Park, J. H. Ryu, "Aroundplot: Focus+Context Visualization for Off-screen Objects in 3D Environments", Computers & Graphics Journal, 2011

Y.W. Park, S. Hwang, T.J. Nam, "Poke: Emotional Touch Delivery through an Inflatable Surface over Interpersonal Mobile Communications", ACM Symposium on User Interface Software and Technology (UIST Poster), Santa Barbara, CA, 2011

Y.W. Park, S. Hwang, K. Lee , "EmPress: Intuitive Emotional Communication over Pressure Sensitive Keyboards", HCI Society of Korea (HCIKOREA), Korea, 2011

S. Hwang, J. H. Ryu, "Clickable Touch: Physical Click Feedback on Touchscreen Interfaces", HCI Society of Korea (HCIKOREA), Korea, 2011

YEAR 2010

S. Hwang, J. H. Ryu, "The Haptic Wheel: Vibro-tactile based Navigation for the Driving Environment", IEEE International Conference on Pervasive Computing and Communications (PerCom) Workshop on SmartE, Mannheim, Germany, 2010

S. Hwang, S. Kim, Y. Park, C. Lim, "Development of Multi-touch based Korean inputting system and Evaluation", HCI Society of Korea (HCIKOREA), Korea, 2010

S. Hwang, K. Lee, W. Yeo, "My Green Pet: A Current-based Interactive Plant for Children", International Conference on Interaction Design and Children (IDC), Barcelona, Spain, 2010

S. Hwang, J. H. Ryu, "EXMAR: EXpanded view of Mobile Augmented Reality", The IEEE International Symposium on Mixed and Augmented Reality (ISMAR Poster), Seoul, Korea, 2010

YEAR 2009

Y. Park, S. Hwang, "A Study on the User Interaction Method for Web Browsing in Mobile Phone", Korea Society of Design Science (DESIGNKOREA), Seoul, Korea, 2009

S. Hwang, K. Lee, C. Lim, "VirtualThumb: One Handed Multi-Touch Emulation on Small Devices", ACM Symposium on User Interface Software and Technology (UIST Poster), Victoria, Canada, 2009

S. Hwang, K. Lee, D. Park, W. Yeo, "Biolin: Current-based Collaborative Musical Interface", International Conference on Advances in Computer Entertainment Technology (ACE), Athens, Greece, 2009

S. Hwang, K. Lee, W. Yeo, "Introducing Current-based Interactive Plant", ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia (SIGGRAPH Asia Poster), Yokohama, Japan, 2009

YEAR 2008

G. Go, S. Han, J. Lee, Y. Choi, B. Sun, Y. Park, D. Jung, S. Hwang, C. Lim, "Hanmadang: Entertainment Systems for Massive Face-to-face Interaction", International Conference on Advances in Computer Entertainment Technology (ACE), Yokohama, Japan, 2008



COPYRIGHT © 2012
SUNGJAE HWANG