2016 Projects

We really appreciated all of the hard work, time, and dedication put in by all of the teams during the hackathon! Here's a list of projects from 2016:

Team Colleen (1st place): Colleen sometimes falls when getting out of her wheelchair and has difficulty getting back up. She needs to be seated in order to transfer to her walker and then back into her chair. The team designed a system she can move herself to that provides a lift assist, raising her high enough to stand back up with the assistance of a walker. A scale embedded into the seat of the power lift chair also provides weight information, which is otherwise difficult to obtain without a special wheelchair scale. This information is sent to an online interface for easy visualization and tracking.


Team Colleen preparing to install the motor for the lift


Team Jae (2nd place): The team built a customized, foldable cane attachment for a visually impaired wheelchair user to help with navigation without getting in the way of a guide dog. Thanks to its magnetic links, the cane can be folded into multiple configurations for easy storage on public transportation or for passing through a doorway. There are two grip positions on the cane to maximize comfort.


Team Jae prototyping the gimbal for the cane


Team Megan (3rd place): The team worked with a teenager who does not communicate verbally. They made a storybook for her that she can show people to introduce herself, and a platform that allows her to touch other people on the platform and feel music bass as a result (she loves music).

Team Jonathan (Honorable Mention): The team made an app that lets a blind user tell whether the lights are turned on or off. The app emits a sound whose frequency depends on the intensity of the light. Jonathan left the hackathon with a trial version of the app after testing it all day. The app, later named Boop, was released on July 26, 2016, on the iTunes Store. You can also find it on its website for future updates and features: http://arii.github.io/boop/.
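The light-to-sound idea behind the app can be sketched as a simple mapping; the function name and frequency range below are illustrative assumptions, not Boop's actual implementation.

```python
# Illustrative sketch (not Boop's actual code): map a normalized light
# reading to an audible pitch, so brighter light produces a higher tone.

LOW_HZ = 220.0    # assumed pitch for darkness
HIGH_HZ = 1760.0  # assumed pitch for full brightness

def light_to_frequency(intensity: float) -> float:
    """Map a camera light reading in [0, 1] to a tone frequency in Hz."""
    intensity = min(max(intensity, 0.0), 1.0)  # clamp sensor noise
    return LOW_HZ + (HIGH_HZ - LOW_HZ) * intensity
```

A dark room maps to the low tone and a bright one to the high tone, so the user can judge the lights by pitch alone.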


Team Jonathan testing the Android application


Team Barbara (Honorable Mention): The team made a live-editable, peer-based interface for speech-to-text transcription. Anyone in a room can see the transcription and fix it, and the changes are immediately visible to everyone else.

Team Tayjus: The team designed a mechanism to press elevator buttons for a wheelchair user, consisting of a robotic arm with three degrees of freedom able to press elevator buttons, press accessible door buttons, and potentially scan his student ID. The mechanism mounts onto his wheelchair and is controlled by a system of arcade-style buttons that is easy for him to navigate.

Team Stephen: The team built a self-stabilizing pen that could help someone with tremors write more easily. The current model uses mechanical damping along with an accelerometer that detects tremors and cancels them using a motor.
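The active-cancellation idea can be sketched as follows: a low-pass filter separates slow, intentional motion from fast tremor, and the motor is commanded to oppose the tremor component. The smoothing constant and function below are assumptions for illustration, not the team's firmware.

```python
# Hypothetical sketch of tremor cancellation: low-pass the accelerometer
# signal to estimate intentional motion, treat the remainder as tremor,
# and command the motor to apply the opposite of that tremor.

ALPHA = 0.1  # low-pass smoothing factor (assumed)

def cancellation_commands(accel_samples):
    """Return one motor command per sample that opposes the tremor."""
    baseline = accel_samples[0]
    commands = []
    for a in accel_samples:
        baseline = (1 - ALPHA) * baseline + ALPHA * a  # intentional motion
        tremor = a - baseline                          # high-frequency part
        commands.append(-tremor)                       # oppose the tremor
    return commands
```

Steady motion passes through untouched (the tremor estimate stays near zero), while rapid oscillations produce opposing commands.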

Team Matt: The team worked on a web interface to help people with disabilities navigate the MIT campus. MIT's current accessibility map condenses the locations of amenities on different floors by representing all of them as icons on a single plane. Their goal was to develop a website that displays a dynamic floor-by-floor, building-by-building accessibility map of the MIT campus. In contrast to MIT's accessibility map, their application MapDeduce allows users to see which amenities are available on a specific floor in a given building, as well as notes about each individual amenity (e.g. which floors a particular elevator services). These features will allow people with disabilities to more easily navigate around the MIT campus.

Team Taylor: The team designed a pill bottle opener for an MIT student with limited grip strength. The openers allowed the client to open pill bottles during the hackathon.

Team Jeffrey: The team worked on a laser mail detection system that notifies a blind user when mail has entered their mailbox. The product uses radio communication to notify the user at home, through sound and light, when an object has obstructed the path of the lasers in their mailbox.
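The beam-break detection can be sketched as a simple threshold check; the threshold value and function name here are illustrative assumptions, not the team's actual firmware.

```python
# Illustrative beam-break check: when mail blocks the laser, the
# photodiode reading drops below a threshold, which would trigger the
# radio link and the sound-and-light alert indoors.

THRESHOLD = 0.5  # assumed normalized photodiode level for "beam visible"

def mail_detected(readings) -> bool:
    """Report mail if any sample shows the laser path obstructed."""
    return any(r < THRESHOLD for r in readings)
```

A momentary dip in the readings, such as an envelope crossing the beam, is enough to flag that mail has arrived.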

Team Alan: The team worked on an app-based interface that could press buttons for a user with limited mobility by identifying the button of interest and actuating a robotic arm to press it.

Team Karthik: The team worked on increasing the accessibility of computer programs for visually impaired users. In particular, they coded programs for displaying weather, world time, exchange rates, and calendars in a visually accessible manner, and customized Karthik's text editor to highlight text and braces with higher visual contrast. Their code is open-source and hosted on GitHub.

Team Kate: The team worked on alarm systems for a user with a hearing impairment. The wearable systems provided a light stimulus at a pre-set time, waking the user.

Team Mateo: The team worked on modifying drumsticks so that a small child with limited arm mobility could use them independently. They also made a drum stand for his wheelchair so that he can play the drums without help.

Team Lisa: The team worked on designing an ergonomic foot mouse that can be used long-term without discomfort.

Thank you again to all of the teams and clients that participated last year! We greatly appreciate your time and efforts and we look forward to seeing you again this year!


Team creates quick prototypes to test ideas like a grabber



Team machines parts with the guidance of Lincoln Lab volunteers


Contact

If you are interested in participating as a student or client, we would love to hear from you at at-hack-core@mit.edu!