Hackathons have evolved over the last few years as participants have gotten used to the event style and now come to events much better prepared for the intense discussions, ideation sessions, and coding. The core driver for hackathon attendance, thankfully, remains the same: developers are looking for a community of like-minded individuals who are social, collaborative, and open-minded to new ideas. And boy did the developers deliver on new ideas at our recent CloudXR Hackathon, hosted by Hubraum in Berlin on October 19th.
This developer challenge focused on the intersection of “extended reality” and edge compute. Nreal, Vive, Vuzix, and Wacom supplied developers with a significant amount of hardware for the challenge. To get the idea juices flowing early, we ran an idea challenge ahead of the hack, and Kira Vinogradova and Leonardo Opitz (https://twitter.com/leonardoopitz) came up with the two winning submissions, each receiving 250€. Kira’s idea was an AR fitting room, and Leo’s was real-time parking space detection via buses. We also kicked off the weekend with an ideation session and technical walkthrough that helped the teams sharpen the unique selling proposition of their solutions.
One of the teams created a virtualized wah-wah pedal. What is this device, you ask? It is the pedal a guitarist steps on to change the sound of the guitar. The brilliance of this project lies in its simplicity as well as its usefulness. Instead of being stuck in a fixed location because of the physical wah pedal, a guitarist can now wander around the stage in a much more entertaining and engaging fashion while still controlling the effects pedal from anywhere onstage.
As for the winners of the MobiledgeX prizes, Team ARBoard (23) took our first-place prize of 2000€, and second place went to Team InstaPark (14). Team ARBoard created an AR wah-wah pedal: guitarists can now walk anywhere on the stage and still access their pedals without being tied to a specific location, dramatically increasing their freedom of movement and artistic expression. Team InstaPark created a machine learning app that automatically spots open parking spaces in real time by leveraging smartphones mounted on buses, trams, and e-scooters. Overall, the winning teams are as follows:
1st place -- Team ARpolis (5) -- Visualizing the future outlook of cities using AR. Everyone will be able to see the future of their district on their smartphone. The team’s technology is based on a combination of Google Street View and computer vision, with precision up to 10 cm. Neural networks and cloud computing allow segmentation of trees, cars, and pedestrians, placing them in the foreground for the most realistic AR experience possible.
2nd place -- Team Where-AR-we (26) -- A real-time event information and location-sharing platform with indoor navigation.
3rd place -- Team Speech2AR (14) -- Real-time speech-to-text translation via the Vuzix AR glasses, essentially letting the user read subtitles with different colouring corresponding to different speakers and their emotions. The solution analyzes the voice and its properties, such as tonality, pitch, and frequency, using machine learning algorithms.