Teams and Proposals

Page history last edited by Akshay Goil 12 years, 7 months ago

 

This page is dedicated to team formation and proposals for the class project.

 

If you have an idea for a project, describe it below using the same format as the example project (Anti-Barking Vest for Dogs). Feel free to copy/paste. If you are looking for a project to join, browse the project descriptions and enter your name under 'Students interested in the project'.

 


 

Project Title: Anti-Barking Vest for Dogs (Example Project - Use it as a Template)

 

Description: Several solutions aimed at preventing dogs from incessant/nuisance barking are available on the market. They are typically based on the idea of (1) delivering electric shocks to the dog's neck through an instrumented collar or (2) spraying a substance such as citronella around the dog's nose area. Depending on the dog and situation, these approaches might work, but they are not perceived as humane. I propose to build a dog vest with vibration motors that activate whenever the vest senses that the dog is barking continuously. The hypothesis is that the discomfort caused by the vibration close to the dog's body will cause the animal to redirect its attention away from the source of barking.
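As a rough sketch of the "barking continuously" trigger, the vest firmware could keep a sliding window over detected bark events and activate the motors only when barking is sustained. The class name, window length, and bark-count threshold below are illustrative assumptions, not part of the proposal:

```python
from collections import deque

class ContinuousBarkDetector:
    """Trigger the vibration motors only when barking is sustained,
    not on isolated barks. Thresholds here are illustrative guesses."""

    def __init__(self, window_s=10.0, min_barks=5):
        self.window_s = window_s    # sliding window length in seconds
        self.min_barks = min_barks  # barks within the window to count as "continuous"
        self.events = deque()       # timestamps of detected barks

    def on_bark(self, t):
        """Record a bark detected at time t; return True if motors should fire."""
        self.events.append(t)
        # drop barks that have fallen out of the window
        while self.events and t - self.events[0] > self.window_s:
            self.events.popleft()
        return len(self.events) >= self.min_barks
```

A bark classifier (e.g., from a microphone on the vest) would call `on_bark` with each detection; occasional barks never reach the threshold, while a rapid series does.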

 

Proposed by: Edison Thomaz

 

Students interested in the project:

 

1. Santiago Botero

2. Tomo Holica

3. Uma Vigota

4. Juka Luca

 


 

Project Title: BrailleTouch: Touch Typing on a Touch Screen

 

Description: Eyes-free text entry, or "touch typing," is a challenge on mobile devices. There is no current solution that rivals the efficiency or ease of use of a full-sized QWERTY keyboard. Mobile eyes-free text entry is especially challenging for visually impaired users on the soft keyboards found on touch screen devices, such as the iPhone. Researchers at Georgia Tech have developed prototypes for a soft chorded keyboard on the Apple iPhone and iPad, based on the standard 6-key keyboard that blind people use to type braille.

 

There are three teams working on different aspects of this project. If you are interested, please sign up below under the team you are interested in, and include your email address and CS or ID to indicate your background.

 

Proposed by: Caleb Southern (caleb.southern@gatech.edu)

 

BrailleTouch Team 1: Smartphone - Typing on the Back

In this project, we will examine this technology for adoption by sighted people, with the purpose of replacing the soft mini-QWERTY keyboards commonly found on touch screen smart phones and tablets. We will investigate the implications of eyes-free typing while on the go, and the potential for placing a multi-touch soft keyboard on the rear of the mobile device, so as not to occlude the limited screen real estate on the front. What does it mean to be able to touch type on your smartphone while on-the-go, and with your visual attention focused elsewhere?

 

This team could benefit from an Industrial Design student, especially if we prototype a solution for typing on the back case of a smartphone.

 

Students interested in the project

1. James O'Neill 

2. Kevin Kraemer 

3. Sunil Garg

4. Travis Wooten

Alternates:

5. 

6. 

 

 

BrailleTouch Team 2: Touch Typing on Flat Glass (Braille on a Tablet)

The previous version of the BrailleTouch soft chorded keyboard on the Apple iPad implements a rigid, inflexible interaction. The user must press-and-hold to set his or her keyboard location on the screen, and then continue to remember the exact location of each virtual key on the flat glass, with no tactile or visual cues. This limitation results in a frustrating typing experience for many users, whose natural inclination is to let their hand locations drift over time, resulting in erroneous and unrecognized keystrokes. In this project, we will explore interaction techniques to overcome these obstacles and to improve the usability and user experience of BrailleTouch on the tablet form factor.

 

One idea we will explore is to reverse the normal finger action when typing. Currently, the rest position is to hover the hands over the glass without touching the screen, which makes it difficult to remember where the virtual keys are located. Imagine, conversely, that in the rest position the user has all of her fingers resting on the glass touchscreen. What if typing were done by lifting fingers off of the screen, rather than the conventional action of tapping it? Other features might include simply allowing the thumbs, or the palms, to remain on the glass while typing normally with the other fingers. This would also help the user maintain a consistent location over the virtual keyboard even without visual or tactile aids. Additional interaction improvements might include audio feedback to help the user improve their typing accuracy on flat glass.

 

Another solution is to build a more intelligent keyboard -- one that can infer which characters the user meant to type on flat glass. In this proposal, there is no need to set a fixed location for the six virtual soft keys of the Braille keyboard. The software would examine the user's chorded touch events and infer the intended keystroke, regardless of where the user's hands were on the touchscreen. One strategy would be to implement a moving average of key locations, allowing the computer to follow the user's hands as they drift from the starting keyboard location over time. Other strategies could involve various heuristics and geometric analyses, or even machine learning approaches. Yet another approach would be to wait for the space character marking the end of each word, and then compute the most likely characters intended by the user with the help of an English dictionary. The end result would be that the user could touch type Braille chords freely anywhere on the surface of a tablet, and the machine would correctly interpret the keystrokes.
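The moving-average strategy could be sketched roughly as follows: each touch is assigned to the nearest stored key position, and that position is then nudged toward the touch so the keyboard follows drifting hands. The class, the six starting key positions, and the smoothing weight are hypothetical choices for illustration, not the project's actual design:

```python
class DriftingChordKeyboard:
    """Assign each touch to the nearest of six virtual Braille keys,
    then move each key's stored position toward the touches it receives,
    so the keyboard follows the user's hands as they drift."""

    def __init__(self, key_positions, alpha=0.2):
        # key_positions: six (x, y) starting locations for the virtual keys
        self.keys = [list(p) for p in key_positions]
        self.alpha = alpha  # exponential moving-average weight for new touches

    def classify_chord(self, touches):
        """Map a chord (list of (x, y) touches) to the set of key indices hit."""
        pressed = set()
        for (tx, ty) in touches:
            # nearest stored key wins, regardless of absolute screen position
            i = min(range(len(self.keys)),
                    key=lambda k: (self.keys[k][0] - tx) ** 2 +
                                  (self.keys[k][1] - ty) ** 2)
            pressed.add(i)
            # moving average: follow the drifting hand
            self.keys[i][0] += self.alpha * (tx - self.keys[i][0])
            self.keys[i][1] += self.alpha * (ty - self.keys[i][1])
        return pressed
```

After repeated touches near a new location, the stored key position converges there, so slow hand drift no longer produces unrecognized keystrokes.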

 

Students interested in the project

1. Joseph Lin

2. Harish Kothandaraman 

3. Vignesh Somasundaram  

4. Sudarsanan Krishnan

 

Alternates:

5.

6.

 


 

Project Title: Supporting Remote Play Between Children

 

Description: This project focuses on designing technologies that make it possible for children to engage in free play remotely over videochat. One possible avenue is creating a physical interface to make it easier for children to initiate play with a remote playmate (a physical buddy list of sorts). There is also room for other ideas for enriching remote free play between children. Please contact lana@cc.gatech.edu for more information.

 

Proposed by: Lana Yarosh

 

Students interested in the project:

 

1. Saie Deshpande

2. Madhura Bhave

3. Madison Berger (ID)

 


 

Project Title: Detecting Swelling Using Conductive Stretchable Fabric

 

Description: Congestive heart failure (CHF) is a serious heart disease characterized by the weakening of the heart muscle. One of the symptoms of CHF is fluid retention and weight gain, due to kidney malfunction. As a result, individuals with CHF may experience swelling of the ankles, legs and abdomen. The goal of this project is to develop a wearable device in the form of a band or compression sock that can "sense" how stretched it is, and thus, serve as an early-warning indicator of CHF. This could be accomplished with a conductive, stretchable fabric interfaced with a processing board, additional sensors and/or a mobile phone for communication and notification.
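One way the "sense how stretched it is" idea might work electrically: if the conductive fabric forms the lower leg of a voltage divider read by an ADC, its resistance (and hence strain) can be estimated from the reading. Every constant below (supply voltage, divider resistor, unstretched fabric resistance, gauge factor, alert threshold) is an illustrative guess, not a measured property of any real fabric:

```python
def strain_from_adc(adc, vcc=3.3, r_fixed=10_000.0, r0=5_000.0,
                    gauge=2.0, adc_max=1023):
    """Estimate fabric strain from an ADC reading of a voltage divider
    with the fixed resistor on top and the fabric on the bottom leg.
    All constants are illustrative, not calibrated values."""
    v = adc / adc_max * vcc                # ADC counts -> volts at the divider tap
    r_fabric = r_fixed * v / (vcc - v)     # solve the divider for fabric resistance
    return (r_fabric - r0) / (r0 * gauge)  # gauge equation: dR/R = G * strain

def swelling_alert(strains, baseline, threshold=0.15):
    """Flag a window of readings as possible swelling if every sample
    sits well above the wearer's baseline strain."""
    return all(s - baseline > threshold for s in strains)
```

Requiring every sample in a window to exceed the threshold is a crude way to ignore momentary stretches (e.g., flexing the ankle) while catching sustained swelling.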

 

Proposed by: Edison Thomaz

 

Students interested in the project:

 

1. 

2. 

3. 

4. 

 

 


 

Project Title: Brain-Mobile Phone Interface for Creative Arts Therapy

 

Description: Creative arts therapy uses creative expression as a complementary healing method. This project integrates neurofeedback into the creative process to teach patients how to regulate physiological processes, such as stress, while also engaging them creatively. Working on this project, there is opportunity to explore how neurofeedback can be integrated into the creative process (e.g., using EEG data to modulate stylistic elements such as brush stroke, color, etc.), design artworks that respond to data from brainwave sensors (artworks that change based on how you think or feel), and invent new ways of interacting with digital (and possibly physical) artworks through brain-computer interfaces.
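The "EEG data modulating stylistic elements" idea could start as a simple mapping from a normalized neurofeedback value to brush parameters. The mapping below is an invented example (not a therapy protocol), and the input is assumed to be a relaxation score in [0, 1] produced elsewhere from band-power features:

```python
def brush_params(relaxation):
    """Map a normalized neurofeedback value in [0, 1] (e.g., relative
    alpha-band power from an EEG headset) to stylistic brush elements.
    The specific mapping is an illustrative invention."""
    r = max(0.0, min(1.0, relaxation))      # clamp out-of-range sensor values
    return {
        "stroke_width": 1.0 + 9.0 * (1.0 - r),  # calmer -> finer strokes
        "hue_degrees": 240.0 * r,               # stressed red -> calm blue
        "jitter": 0.5 * (1.0 - r),              # stressed -> shakier lines
    }
```

A drawing loop would poll the headset, recompute `brush_params`, and apply them to the next stroke, making the artwork visibly respond to the patient's state.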

 

 

Proposed by: Nick Davis (nicholas.davis10@gmail.com)

 

Students interested in the project:

 

1.

2. Megha Sandesh - msandesh3@gatech.edu

3. 

4. 

 


 

Project Title: Dancing On Desktops (Gesture Modeling)

 

Description: Using hands to directly manipulate computer graphics has been discussed for decades. Earlier systems required instrumented gloves or other equipment, which are cumbersome for designers. Some researchers also argue that such gestures are categorical rather than spatial [1, 2]. Because current real 3D displays cannot be interacted with directly, I believe that projected images could resolve both issues. Since depth cameras (e.g., the Kinect) can capture point clouds in real time, we could support meaningful hand behaviors for building models, as well as use physical artifacts to interact with the built computer models.

 

[1] Norman, D. A. Natural user interfaces are not natural. interactions, 17, 3 2010), 6-10.

[2] Gustafson, S., Bierwirth, D. and Baudisch, P. Imaginary interfaces: spatial interaction with empty hands and without visual feedback. In Proc. UIST (New York, New York, USA, 2010). ACM, 3-12.

 

Here is our Wiki page; if you want to join us and edit the page, please let me know.

 

Proposed by: Chih-Pin Hsiao (chipin01@gmail.com)

 

Students interested in the project:

 

1. Chih-Pin Hsiao

2. Ravi Karkar - rkarkar@gatech.edu

3. Jeremy Duvall 

4. Joshua Zeder

5. Megha Sandesh

 

Second team:

 

 

 

Project Title: Posture Recognition and Location Detection in Specific Contextual Environments

 

Description: The traditional methods for analyzing complex human behavior are coding video or audio recordings, or using sensors worn on the body. The collection, refinement, and analysis of these data are time consuming and dependent on the contextual environment. Instead, a set of depth cameras, such as the Microsoft Kinect, will be mounted to a frame that encloses a space. Each Kinect outputs its points to a server, where an algorithm combines the points and then extracts each person's location in the space and skeletal bone system. The skeletal bone system includes the relative locations of the key parts of the human body and the orientations of the joints between them. Once we have these data, a context-aware system could be created and integrated into our living environment. For example, one proposed system could be deployed in a senior living center to detect falls.

 

This project could be built in one of the bedrooms in the Aware Home.

 

Proposed by: Chih-Pin Hsiao (chipin01@gmail.com)

 

Students interested in the project:

 

1. Chih-Pin Hsiao

2. Megha Sandesh

3.

4. 

 


 

Project Title: ClockReader System

 

Description: The clock drawing test is one of the simplest but most commonly used screening tools for detecting cognitive impairment. The person undergoing testing is asked to 1) draw a clock; 2) put in all the numbers; and 3) set the hands at ten after eleven. The ClockReader System is a computerized, sketch-based clock drawing test. It can record and recognize a patient's freehand drawing, automatically analyze the drawing, and report a score based on the scoring criteria. The current system can capture stroke pressure, air time (pen up and pen down), and drawing sequence. It would be nice if we could deploy the system on mobile devices so that doctors can administer the test anywhere.
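One common scoring criterion, digit placement, hints at how the automatic analysis might work: compare each recognized digit's angle around the clock face with where it belongs. This is a hedged sketch, not the ClockReader's actual scoring algorithm; the tolerance and function names are assumptions:

```python
from math import atan2, degrees

def expected_angle(number):
    """Clock-face angle for a number, measured clockwise from 12 o'clock."""
    return (number % 12) * 30.0

def number_placement_score(placed, cx, cy, tolerance_deg=15.0):
    """Count digits drawn in roughly the right positions. `placed` maps
    number -> (x, y) of the recognized digit in screen coordinates
    (y grows downward); (cx, cy) is the clock center."""
    correct = 0
    for number, (x, y) in placed.items():
        # angle clockwise from straight up, in screen coordinates
        ang = degrees(atan2(x - cx, cy - y)) % 360.0
        diff = abs(ang - expected_angle(number))
        diff = min(diff, 360.0 - diff)  # wrap around the circle
        if diff <= tolerance_deg:
            correct += 1
    return correct
```

A full scorer would combine criteria like this with checks on the clock contour, the presence of all twelve numbers, and the hand positions for "ten after eleven".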

 

Proposed by: Hyungsin Kim (hyungsin@gatech.edu)

 

Students interested in the project:

 

1.

2. 

3. 

4. 

 


 

Project Title: Helping Hand

 

Description: There are now 1100 returning soldiers from Iraq and Afghanistan with vision injuries [Zoroya, 2007]. By the year 2010, the number of Americans with visual impairment is projected to increase from the 11.4 million recorded in the 2000 census to approximately 20 million, of whom more than 3 million will be veterans [De l'Aune, 2002]. For many of these visually impaired individuals, locating and identifying specific objects can be an arduous task, as these activities ideally require eye-hand coordination. To address this issue we propose the Helping Hand: an RFID glove that can be used as an immersive tool to improve the hand-eye coordination skills of the visually impaired. This proposed research directly addresses the VA's mission by attending to the "compensation for loss" issues and "quality of life" mandates.

 

Proposed by: Marc A Lawson (marc.lawson@gatech.edu)

 

Students interested in the project:

 

1. Stephen Audy (ID)

2.

3. Connor Wakamo (cwakamo@gatech.edu)

4. 

 


 

Project Title: Textile-Component Connections

 

Description: Finding easy ways to make connections between textile interfaces and electronic components ("Mag Knot").

 

Proposed by: Clint Zeagler <clintzeagler@gmail.com>

 

Students interested in the project:

 

1. Andrea Del Risco(ID)

2. 

3. 

4. 

 


 

Project Title: Bluetooth or wireless components for textile interface

 

Description: Peripheral or computer networking

 

Proposed by: Clint Zeagler <clintzeagler@gmail.com>

 

Students interested in the project:

 

1.

2. 

3. 

4. 

 


 

Project Title: New Textile Interfaces combining embroidery, screen printing, and fabric manipulation

 

Description

 

Proposed by: Clint Zeagler <clintzeagler@gmail.com>

 

Students interested in the project:

 

1.

2. 

3. 

4. 

 


 

Project Title: Laser Printer Modeling Software

 

Description

 

Proposed by: Scott Gilliland

 

Students interested in the project:

 

1.

2. 

3. 

4.

 


 

Project Title: Electroluminescence

 

Description

 

Proposed by: Scott Gilliland

 

Students interested in the project:

 

1. Fred Leighton (MSDM / ID section) - fleighton3@gatech.edu

2. Patrick Mize (MSCS) - pmize3@gatech.edu

3. Spencer Border (MSCS) - sborder3@gatech.edu

4. Chris Bayruns (CS) - cbayruns@gatech.edu

5. Patricia Tait (ID) - ptait3@gatech.edu

6. Jon Pelc (HCI) - thisisjonpelc@gmail.com


 

Project Title: AwareHome

 

Description

 

Proposed by: Brian Jones

 

Students interested in the project:

 

1. Andrea Del Risco(ID)

2. Josh Fierstein

 

 


 

Project Title: AHA - AwareHomeAwareness

 

Description: Imagine your home being intelligent enough to remind you that the stove is still on by gently pulsing a red light in the room you're in. How about being able to tell immediately from your work room that your son has been playing his Xbox for 3 hours straight, just by glancing over at a portion of your wall where a projected game icon progressively gets brighter and changes color the longer the system has been on? What about simple lighting that tells you if there's water running or an appliance consuming too much juice?

 

The AwareHomeAwareness (name subject to possible change) project seeks to develop an ambient display system (ambient lighting/projected icons) for homes that allows users to pre-attentively perceive information about the home from anywhere in the residence, augmenting the resident's awareness of the status of the home and many of its components, along with general information of interest to the resident. The display can aim to solve common safety issues (fire, door locks), aid in overcoming addictions (video games, social media), encourage power saving (via visualization to increase awareness of power consumption), assist people with health concerns (reminder lights), or just improve home life in general (mail reminders, air conditioner/heater status visualization). The project can be taken in any direction the team deems important. If you are interested in HEALTH, SAFETY, ENERGY, SOCIAL ISSUES (addiction), GENERAL LIFE STYLE IMPROVEMENT, or any other possible area of interest that can be tied into this project, please do not hesitate to sign up.

 

AHA welcomes members with many different backgrounds.

- Human Computer Interaction

- Industrial Design

- Electrical Engineering 

- Computer Science

- Psychology

- etc.

 

This project will provide a great opportunity for design people to show their creativity in conjunction with their technical awareness. This project will serve as a ground where Art & Design meets Technology.

This project will also require a great deal of prototyping in the physical realm. Fast crafty hands are greatly welcome as well as people with experience in some electrical engineering. 

 

Please direct any questions to Sam Lee.

 

Proposed by: Hyung-Min "Sam" Lee <h.lee@gatech.edu, logos50907@gmail.com>

 

Students interested in the project:

 

1. Hyung-Min "Sam" Lee (1st year MSCS: HCI) <h.lee@gatech.edu>

     BS CS in Media and Modeling Simulation. 

2. Oriol Collell Martin (oriolbcn@gatech.edu)

     BS in CS, focused on Software Development.

3. Hanna Schneider (hanna.d.schneider@gmail.com)

4. Dan Huang (dhuang@gatech.edu)

5. Shaun Wallace (swallace6@gatech.edu)

6. 

7. Shubhshankar M.S (sms6@gatech.edu)

8. Harrison Jones (harrisonhjones@gmail.com) - 4th yr ME/CS

Please feel free to keep adding names after 4. If the project team is big enough, it may be possible to divide into two or more groups, each tackling a different interest area (one group handles health while another handles addiction, etc.).

 


Project Title: AHA - AwareHomeAwareness - AwareFridge Project

Description: An aware refrigerator. It helps you keep track of expiration dates and suggests what you could eat based on what's inside your fridge. Maybe it tells you what you have to buy, communicating that to your mobile device or to a food delivery service. We want to work closely with the AHA team, and we meet Fridays at 3pm in the Aware Home! Open to your creative input to invent the next-generation fridge -- which no home will want to miss in 2020!

 

We would warmly welcome another CS student as we have an ID student.

 

Students Interested:

  • Harrison Jones - Project Leader - 4th yr ME/CS - harrisonhjones@gmail.com
  • Hanna Schneider - hanna.d.schneider@gmail.com
  • Shaun Wallace (swallace6@gatech.edu)
  • Andrea Del Risco (ID)
  • Sean Swezey - sswezey@gatech.edu 

 


 

(THIS IS ONLY FOR STUDENTS ALSO TAKING THE HCI COURSE)

 

Project Title: N/A

 

Description: For students like me who are taking both this course and the HCI one (both with Abowd), we have an opportunity to do a common project with that course (I have checked with Abowd that this is feasible). This would give us the opportunity to focus on one project instead of having to deal with two; thus, instead of doing two good medium-complexity projects, we could do a single awesome high-complexity project. I was thinking of a project on the theme of "home life improvement," which fits perfectly in both courses and also gives us the opportunity to participate in the ACM Research Contest.

But how would the project fit in both courses? We could start with an already defined idea for a product. Then, for Project 1 in this course and the first part of the HCI project, we could, in parallel, build the product, think of a problem that the product does not yet solve but that we want it to, and explore its possible implementations. In the second part of both projects, we could then take one of the options we considered to enhance the product and implement it.

I don't really have any specific idea yet because I want to see if there are more students interested first, but since this theme is wide open and also a current research trend, we won't have any problem gathering ideas.

 

Proposed by: Oriol Collell Martin (oriolbcn at gatech.edu)

 

Students interested in the project:

 

1. Oriol Collell Martin

2. Patrick Mize

3. 

4.

 


 

 

Project Title: Children's Long-distance Interactive Play System (CLIPS)

 

Description: Augmented Reality technology thus far has primarily been used to create single user experiences, where one user explores digital content overlaid upon their actual environment.  We propose to use the Argon platform to create a system that enables children in separate locations to play location-based games with one another in real-time, e.g. tag, capture the flag, etc.  Each child will be outfitted with a location aware device that will track their position and provide a display that shows their remote companions rendered into the local surroundings.  

 

Proposed by: Ed Anderson, Isaac Kulka, Josh Teal

 

Students interested in the project:

 

1. Ed Anderson

2. Isaac Kulka

3. Josh Teal

4. Hafez Rouzati

 

 


 

Project Title: Location-based profiling in the Aware-Home

 

Description: Looking to build location-based profiles for any available appliances in the Aware Home (air conditioning, TV, Internet/Wi-Fi, security system, lights, stove?). We all carry smart phones now, and since our locations are readily available, it would make sense for our residence to be smart enough to turn off the lights, stove, A/C, etc. when we leave. Even simply being able to control these appliances remotely would be a first step.
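The "turn things off when we leave" logic could start as a geofence test on each resident's phone location. The coordinates, radius, and function names below are assumptions for illustration, not part of the proposal:

```python
from math import radians, sin, cos, asin, sqrt

HOME = (33.7756, -84.3963)  # illustrative (lat, lon) near the Georgia Tech campus

def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(h))

def everyone_left(phone_locations, radius_m=150):
    """Power down appliances only when every resident's phone reports a
    position outside the home geofence."""
    return all(haversine_m(loc, HOME) > radius_m for loc in phone_locations)
```

Requiring all phones to be outside the fence avoids the obvious failure mode of one resident leaving while another is still cooking on the stove.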

 

With people's locations becoming more and more available through the widespread use of GPS on smart phones, these sorts of profiles become much more feasible. As a result, if this technology becomes standardized and integrated into our daily lives, it could drastically reduce energy consumption in the home.

 

There were about 6 or 7 people at the Aware-Home meeting, and I only saw one project so far, so I thought I would put up another idea in case anyone is interested. 

 

Proposed by: Josh Fierstein (jfierstein@gatech.edu)

 

Students interested in the project:

 

1. Josh Fierstein

2. Cheng-Pin Robert Lee - robertlee42@gmail.com

3. Donovan Hatch - dhatch6@gatech.edu

4. Yoomi Pyo - ypyo3@gatech.edu

5. Ka Young

 

 

 

Project Title: Rapid ABC Test

 

Description: The Rapid ABC (Rapid Attention Back and Forth Communication) test is a 5-minute behavioral screening test for the early detection of Autism Spectrum Disorders in toddlers. Experts score five activities testing gesturing, attention level, body language, and eye contact. These tests are conducted in the Child Study Lab at HSI, which is currently equipped with microphones, cameras, and other sensors to record and monitor the participants of the test.

 

The project for this class would specifically address two key activities within this test, which involve the use of a book and a ball. By exploring ways to embed sensors within these objects, the project aims to enhance the quality of data obtained from the test. For example, using multiple on-body sensors present in the environment together with the sensors embedded within the ball, it could be possible to know the nature of the interactions with specific objects (who touched the ball, or when was a page of the book flipped?).

 

This project offers an opportunity for students to collaborate with an established group on campus and assist with their on-going endeavors in the field of Autism research. It presents the possibility to work with sensors and also address some of the issues concerning tangible interactions with products.          

 

Proposed by: Dr. Gregory Abowd

 

Students interested in the project:

 

1. Durga Kudtarkar - durga@gatech.edu (ID)

2. Angela Mingione - angela.mingione@gmail.com (CS undergrad)

3. Jon Pelc - thisisjonpelc@gmail.com (HCI)

4. Akshay Goil (akshay.goil@gatech.edu)

 


 

Project Title: Music Match

 

Description: Our project revolves around collaborative music (think Grooveshark, Pandora, Last.fm, Turntable.fm). Technology is driving the music experience away from a social setting, and we aim to reverse that trend. We find it very difficult to share music with our friends, and discovering new music is a tedious process. Our app will sit at the intersection of sharing and discovery, and we plan on using a survey to come up with a necessary feature list.

The initial web app will be written using Ruby, JavaScript, and HTML/CSS. We plan on continuing this project, releasing an iOS version, and implementing extra features.

 

We are doing a joint project with the Mobile Apps and Services class taught by Russ Clark.

 

We are looking for another developer and a Web UI Designer.

 

Proposed by: Alex Samarchi (asamarchi3@mail.gatech.edu)

 

Students interested in the project:

 

1. Alex Samarchi

2. Pradyut Pokuri

3. Carter Templeton (c.templeton@gatech.edu)

4. Dylan Demyanek (ddemyanek3@gatech.edu)

5. Dan Huang (dhuang@gatech.edu)

 


 

Project Title: Human Sensing/Activity Recognition

Description: The project aims at detecting human activity inside the home using the existing infrastructure of wired and wireless networks. Wireless networks have become pervasive in today's world: every home has multiple wired and wireless devices, such as cell phones, laptops, set-top boxes, and gaming consoles. The project involves detecting human activity inside the home using network traffic data at the wireless router. I think there will be two parts to the project:

1. Detecting all the devices inside the home, and detecting human presence, using network traffic data from the wireless router.

2. Tracking online activity to measure time spent interacting with the devices and time spent on different online applications.

I am a PhD student in networking, so I can help with the networking side and the wireless router work, but any set of expertise is welcome. I think it's quite an interesting and challenging project and could lead to something beyond the classroom. Please feel free to ask anything.
Email Bilal Anwer: bilal@gatech.edu
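Both parts could be prototyped offline against a log of (timestamp, MAC address, bytes) records exported from the router. Everything below (the log format, idle timeout, and one-minute bucketing) is an illustrative assumption, not a committed design:

```python
from collections import defaultdict

# Hypothetical traffic log captured at the wireless router:
# a list of (timestamp_seconds, device_mac, bytes_transferred) tuples.

def active_devices(log, now, idle_s=300):
    """Part 1 sketch: a device counts as present if it produced any
    traffic within the last idle_s seconds."""
    last_seen = {}
    for t, mac, _ in log:
        last_seen[mac] = max(t, last_seen.get(mac, t))
    return {mac for mac, t in last_seen.items() if now - t <= idle_s}

def time_per_device(log):
    """Part 2 sketch: rough interaction time per device, counting each
    device's traffic in coarse one-minute buckets."""
    minutes = defaultdict(set)
    for t, mac, _ in log:
        minutes[mac].add(int(t // 60))
    return {mac: len(m) * 60 for mac, m in minutes.items()}
```

A phone that goes quiet for longer than the idle window drops out of the active set, which is one crude signal that its owner has left the home.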

Students interested in the project:

1. Bilal Anwer

 


 

Project Title: Wearable Technology

 

Description: It is not always feasible to pull out a cell phone to check for messages or new updates, particularly when the phone is being carried under heavy clothing or the situation is such that it would be inconvenient to take out a phone.  We are proposing a modular solution to this problem, by integrating ambient alerts to phone events, and deconstructing elements of the phone in order to use them in a more intuitive way.  The solution will be contained in one or multiple pieces of clothing and the individual components will be controlled by an app for Android devices.

 

Proposed by: Lauren Schmidt

 

Students interested in the project:

 

1. Lauren Schmidt

2. Hans Reichenbach

3. E.J. Layne

4. Stephen Joy

 

 
