Ladies and gentlemen, we are live with the hackathon!
On a superb, summery Saturday, we kicked off the 24-hour development marathon at Casa de Cultura a Studentilor, with more than 130 participants at the start of the competition who eventually split into 29 active teams.
Vlad Ciurca, one of Techsylvania’s executive producers, took the stage first, giving the opening speech to the whole audience.
Next, Dan Romescu, our hacknitiator, gave out the broader details of the hackathon and all the prizes available, which can be found here, along with a special thanks to all our hackathon partners and sponsors. Teams were eager to speak their minds and start impressing from the very beginning. And so they did!
So we have Team #1, called Flying Bastards, who are using Leap Motion and a Sphero ball to build a control system for the latter based on hand motions. Basically, by using several hand gestures they intend to create a pattern which the Sphero ball can then follow on its own.
Team #2, called Balls on Fire, are developing a mobile app which turns an image into analytical data and then turns that data into a set of instructions which acts as input for a Sphero ball. Basically, they will draw a maze, take a picture of it, and turn that picture into a set of data which becomes the Sphero’s input. From that point on, with “no hands attached”, the ball will be able to find its own way out of the maze.
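The team’s actual pipeline wasn’t shared, but the “find its own way out” step can be sketched: once the photo has been reduced to a wall/open grid, a breadth-first search yields a shortest escape path, which can then be converted into the heading commands a Sphero-style rolling robot consumes. The grid encoding, start/goal cells, and heading conversion below are all illustrative assumptions.

```python
from collections import deque

# Hypothetical sketch of the maze-solving step:
# 0 = open cell, 1 = wall. BFS finds a shortest path through the grid,
# which is then converted into compass headings for a rolling ball.

def solve_maze(grid, start, goal):
    """Breadth-first search over a 2D grid; returns the path as a list of cells."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            break
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    if goal not in prev:
        return None  # maze has no exit path
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = prev[cell]
    return path[::-1]

def path_to_headings(path):
    """Turn grid steps into compass headings (degrees, 0 = up) for the ball."""
    heading = {(-1, 0): 0, (0, 1): 90, (1, 0): 180, (0, -1): 270}
    return [heading[(b[0] - a[0], b[1] - a[1])] for a, b in zip(path, path[1:])]

maze = [[0, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0]]
route = solve_maze(maze, (0, 0), (3, 3))
print(route)                    # cells from entrance to exit
print(path_to_headings(route))  # -> [180, 180, 90, 90, 180, 90]
```

Each heading/step pair would then be fed to the robot’s roll command one segment at a time.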
Team #3, Macadamian Nutz, are all about smart agriculture. Thinking about tomorrow’s farmer, they are using Leap Motion, Electric Imp and several sensors (temperature, humidity, proximity) to ease a farmer’s life when taking care of his crop. They will have an end-user app through which the user can monitor the live activity of the crop and interact with it just by using hand gestures.
Team #4 is called Rescue Beacon and, by using numerous Onyx Beacons and handheld devices, they are thinking about keeping large groups of people together. Basically, in such a scenario, each attendant holds a beacon and the leader is notified when someone moves outside the indicated parameters of the group. Through a web app they also want to record each user’s pathway and, via a tracking system, send automatic updates (last known location, area, etc.) to an eventual rescuer when a user gets lost.
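The leader-side check described above can be sketched in a few lines. This is an assumed implementation, not the team’s: the leader’s device periodically scans for the beacons handed out to the group and flags anyone whose beacon is undetected or beyond a distance threshold (the threshold and all names here are hypothetical).

```python
# Hypothetical sketch of the group-monitoring check: compare the roster of
# issued beacons against the latest scan and flag missing or distant members.

MAX_DISTANCE_M = 30.0  # assumed "indicated parameter" for staying with the group

def missing_members(roster, scan, max_distance=MAX_DISTANCE_M):
    """roster: dict beacon_id -> member name.
    scan: dict beacon_id -> estimated distance in meters from the leader.
    Returns the names the leader should be alerted about, sorted."""
    return sorted(name for bid, name in roster.items()
                  if bid not in scan or scan[bid] > max_distance)

roster = {"b1": "Ana", "b2": "Mihai", "b3": "Ioana"}
scan = {"b1": 4.2, "b3": 55.0}  # b2 not detected at all, b3 is too far away
print(missing_members(roster, scan))  # -> ['Ioana', 'Mihai']
```

In practice the distance would be estimated from beacon signal strength, and the same scan log would feed the "last known location" updates sent to a rescuer.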
Team #5 is called Shockwave and they’re out to revolutionize live social interaction with new people. Using an Android app and a smartwatch, you will be able to exchange contact details with a new person just by shaking hands with them. Without location or GPS tracking, just by using an accelerometer and timestamps, the smartwatches will automatically pair when users shake hands for more than a predefined number of seconds, and each user then receives a pairing & import request on their smartwatch. They will also have a record of recent new contacts on their handheld device. They are the first team to develop on the Romanian-made Vector Smartwatch.
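The accelerometer-plus-timestamps matching can be sketched as interval overlap: each watch reports the window during which it detected handshake-like motion, and two devices pair when their windows overlap for longer than the threshold. This is a minimal assumed version of the logic (device IDs, threshold, and server-side matching are all illustrative).

```python
# Hypothetical sketch of the handshake-pairing logic: match shake events
# by timestamp overlap alone, no location or GPS involved.

MIN_OVERLAP_S = 2.0  # assumed "predefined number of seconds"

def overlap(a, b):
    """Seconds of overlap between two (start_ts, end_ts) intervals."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def find_pairs(shakes, min_overlap=MIN_OVERLAP_S):
    """shakes: dict device_id -> (start_ts, end_ts) of detected shaking.
    Returns all device pairs whose shakes overlapped long enough."""
    ids = sorted(shakes)
    return [(x, y) for i, x in enumerate(ids) for y in ids[i + 1:]
            if overlap(shakes[x], shakes[y]) > min_overlap]

events = {"watch-A": (100.0, 103.5),
          "watch-B": (100.8, 104.0),   # overlaps watch-A for 2.7 s -> pair
          "watch-C": (200.0, 202.1)}   # shook at a different time, no match
print(find_pairs(events))  # -> [('watch-A', 'watch-B')]
```

Each matched pair would then trigger the pairing & import request on both watches.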
Team #6, called UOM, are using Google Glass for face-recognition purposes at various networking and social events. For instance, when you fill out the registration form for a certain event, you will also add a few pictures of your face using your webcam. The pictures go into a database the Glass takes its data from, and when scanning for people in a large venue you will be able to get their contact details. Kind of next-gen, right?
Team #7, called Gestexpress, are translating gestures into voice. They are specifically thinking of using their Leap Motion-based technology for people with vocal impairments, or for recovering patients who are having a hard time with motion gestures.
Team #8 is called Rift and, by using a combo of Oculus Rift and Leap Motion, they want to transform the coding process from data input via today’s classical devices to a more natural approach: using motion to interact with objects (or libraries, that is) and move them around physically in a virtual-reality environment. Their idea can also easily be shifted from Oculus to Google Cardboard, again connected with Leap Motion.
Team #9 is called Attach and they are planning to build software for a Moto360 watch to ease our interaction with PowerPoint presentations. Basically, take out the need for a presenter remote and use motion gestures controlled by the smartwatch.
Team #10 is Gaggle, who are targeting real social interaction as the end result of their wearable idea. Also using a Moto360, they are developing an app which helps you find the best matches for various discussions based on your interests. They call it an “ice-breaking app for smartwatches”. The interest list is user-defined when creating the profile after installing the app.
Team #11 is Skipper. They want to use Leap Motion to control multiple CCTV or other types of recording cameras. They are taking live feeds from several cameras and want to interact with them just by using motion gestures.
Team #12 is called Beat the shape. They plan to really do so by combining a Withings smart scale with a smartwatch, Onyx beacons and a smartphone. Creating an experience for those interested in losing weight, the scale sends user data to the smartwatch, and when, for instance, you are thinking of having that extra snack in the middle of the night, another device connected through Onyx Beacons will be able to send you a reminder on the smartwatch that you shouldn’t. Imagine that the fridge could literally talk to you instead of just feeding you! 🙂
Team #13 is Smartcar. As the name suggests, they want to connect the car to your smartwatch and let the former communicate with you when you are making driving mistakes, for instance. Integrating a microcontroller and a Raspberry Pi into the car, they want to send motion signals (vibrations, for instance) to your wrist when you want to change lanes and forget to signal, or when you fall asleep while driving. Safety comes first, right? Their idea is scalable on both the Apple Watch and the Moto360. They are the first team at the Techsylvania hackathon to use the Watch for their project.
Team #14 is called DY. They plan to replace physical paper with a web interface when, for instance, you are playing board games and need to keep score. Users can input the score in a web app connected to a Pebble device, keeping the score in a digital way. Save the paper!
Team #15 is Meepo. Using Leap Motion, they want to recreate the same motion-based experience for presenters. By creating predefined gestures they want to take out the regular presenter remote and make real use of your hands when presenting.
Team #16 is called Android Monsters. They’re creating an indoor mapping system. Let’s suppose you’re at the gym and you pre-load your fitness training program on your device. As each fitness machine in the gym has a beacon attached, a device like a Garmin will show you the appropriate machine to use, guide you to that machine, send you notifications when your program is set to finish, then redirect you to the next one, and so on and so forth.
Team #17 is called Arobs. A first-aid-kit app with short training lessons on CPR is their basic idea. Using a smartwatch as an input, the app aims to connect various smartwatches like Garmin or Pebble and offer you CPR rules plus motion notifications on the wearable (number of repeats, when to start/stop, etc.) while you are performing CPR. Also, when the procedure starts, the app is set to automatically call the emergency number and put the phone on speaker so that the rescuer is well guided.
Team #18 is called NoGPS. Their idea is as simple and effective as one can imagine. They aim to detect indoor (and maybe outdoor as well) position without using GPS, only the gyroscope and accelerometer. They aim to scale the idea to every device which already has a gyroscope and accelerometer integrated, and their first products of choice are an iPhone and the Metawear platform.
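Gyroscope-plus-accelerometer positioning is classically done by dead reckoning. Here is a toy 2D sketch of that general technique, not the team’s actual implementation (real systems need drift correction and sensor fusion, which are omitted): integrate the gyroscope’s yaw rate to track heading, then integrate the accelerometer reading twice to estimate position.

```python
import math

# Toy 2D dead-reckoning sketch (illustrative, no drift correction):
# heading from the gyroscope, position from double-integrated acceleration.

def dead_reckon(samples, dt):
    """samples: list of (forward_accel_m_s2, yaw_rate_rad_s) device readings,
    taken every dt seconds. Returns estimated (x, y, heading)."""
    heading = 0.0          # radians, 0 = facing +x
    vx = vy = x = y = 0.0
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt
        # rotate body-frame forward acceleration into the world frame
        ax = accel * math.cos(heading)
        ay = accel * math.sin(heading)
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y, heading

# 1 second of gentle straight-line acceleration at 1 m/s^2, no turning:
x, y, heading = dead_reckon([(1.0, 0.0)] * 10, dt=0.1)
print(x, y, heading)  # ends up ~0.55 m along +x, heading unchanged
```

The appeal, as the team notes, is that this needs nothing beyond the sensors already built into phones and wearables; the hard part in practice is keeping the accumulated integration error under control.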
Team #19, called Troglobyte, aims to manage a swarm of Parrot drones using a smartphone or laptop.
Team #20 is Race Evolution. Using Garmin devices, and later scaling to Google Glass, they aim to send exact data to a participant in any type of race about position and time (compared to the runner ahead, the runner behind, and the race leader), and maybe send signals to other participants when one of them stops or quits the race. Basically, they aim to send relevant real-time data to race participants using smartwatches.
Team #21 is called Yopeso Team. The second team at the hackathon using an Apple Watch, they plan to integrate it with a car and send speed-limit notifications to the driver. Whether you’re driving too fast or too slow, the Apple Watch is there to send you push notifications.
The list is currently updating. We still have more teams which are developing so keep following us throughout the day.
Team #22 is named Trencadis and they’re developing a web app for improving attention span. In a nutshell, they connect the Eyetribe tracker with a desktop and with a Moto360. Sending all the data to the cloud, the game changes its intensity depending on the user’s pulse, which is tracked by the smartwatch. They say you can’t beat the computer, but who knows? Find the game at fokăs.com
Team #23 is !AFK. They have built a virtual audio mixer which can be controlled using Leap Motion. You can change the volume and pitch and switch between tracks. Even some mixing is available, so we guess some DJs will be around for the showcase.
Team #24 is Pitech Plus and their project name is Finsig. Their idea brings back some recollections of last year: use hand gestures instead of a regular password. An access control system may be just one of the deployments of their project. For the motion control they’re using a Leap Motion.
Team #25 is using 3D printing to build a regular mechanical watch from scratch, with a battery that will last… no more than 2 hours.
Further edits to come.