On Monday, the 11th of November, a group of neuromorphic computing enthusiasts gathered at the Fortiss Research Institute, with a beautiful view of Munich and the Alps, and worked day and night until the Friday of the same week on tasks ranging from Spiking Neural Networks (SNNs) to event-based datasets. Inspired by the picturesque view, each team put all their effort into expanding on previously explored research projects, provided by Fortiss, neuroTUM, IBM Research and, last but certainly not least, the deep-tech start-up Neurobus.
What makes our neuromorphic hackathon special is its research focus: each team is assigned a specific topic and is accompanied throughout the event by a mentor from one of the partner organisations. Among the Fortiss-led teams were ‘Binocular Depth Estimation’ and ‘Time-to-First-Spike’, the latter supervised by IBM researcher Stanislaw Wozniak. Another team, led by our own neuroTUM members, focused on ‘Spiking Vision Transformers’. And, of course, there was the Neurobus ‘SPADES’ team.
On the first day, the participants were invited to listen to talks from different speakers in the field, which helped them decide which teams to join.
The teams got together during the lunch break and went out to grab a treat. They also had the opportunity to play kicker (table football) whenever they needed some inspiration away from the screen. This let the teams discuss what had been done, exchange ideas, and simply get some fresh air. Even though everyone was busy with their projects, such small pleasures made them all the more motivated to delve into developing different SNN architectures.
Speaking of goals, we talked to one of the participants to learn more about her experience at the event. Katya mentioned that her goal was to ‘learn more about SNNs’, so she decided to join the Time-to-First-Spike team. Stan, an IBM researcher based in Zürich, visited us in Munich on Monday, shared his knowledge, and provided the team with a research paper and a Git repository on high-performance deep SNNs. The team's goal was to reduce the latency of the time-to-first-spike network introduced there. SNNs are incredibly energy efficient, which makes them a great fit for hardware applications such as self-driving cars, where a fast response to things like traffic signs is essential. The team succeeded in lowering the runtime of the algorithm. Together with ideas from the Fortiss supervisors Michael and Beste, the team, comprising Paul, Aaron, Kai, Giuseppe and Katya, expanded on the topic, added their own spark and delivered promising results.
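For readers curious what ‘time-to-first-spike’ actually means, here is a minimal sketch of the core idea (not the team's actual code; the function name `ttfs_encode` and the time window `t_max` are illustrative choices): each input value is encoded by *when* a neuron fires its first spike, with stronger inputs firing earlier. Because the answer is carried by the earliest spikes, lowering the time to the first spike directly lowers the network's response latency.

```python
import numpy as np

def ttfs_encode(values, t_max=100.0):
    """Time-to-first-spike encoding: map normalized intensities in [0, 1]
    to first-spike times. Stronger inputs spike earlier, weaker ones later."""
    values = np.clip(np.asarray(values, dtype=float), 0.0, 1.0)
    # A maximal input (1.0) fires at t = 0; a zero input fires at t = t_max.
    return t_max * (1.0 - values)

# Example: a bright pixel spikes immediately, a dim one only at the deadline.
times = ttfs_encode([1.0, 0.5, 0.0])
print(times)  # [  0.  50. 100.]
```

This linear mapping is only one of several encoding schemes used in the TTFS literature, but it captures why the coding style is so latency-friendly: a decision can be made as soon as the first spikes arrive, without waiting for the full time window.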
They were not the only team with achievements. The Binocular Depth Estimation team was tasked with computing depth values at high speed from the 2D images of two event-based cameras, which act much like a pair of human eyes.
To understand this better, try a small thought experiment. Place your index finger in front of your nose, close one eye, then open it and close the other. Now move your finger further away and repeat. You will notice that your finger appears to jump between the two views, and that the jump is larger the closer the finger is to your nose. This apparent shift between the two viewpoints is called disparity, and it played a big role in tackling the problem at hand: the two 2D images were used to calculate depth from the disparity between the two cameras. Using this concept, the team started with a runtime of 8.14 seconds and ended with one of 0.86 seconds, which does not sound bad at all. This speed-up came from taking a different approach than usual and utilising the event-based cameras mentioned earlier. The team, consisting of Denis, Vivien, Mert, Mohammed and Jackey, used algorithms that process events as they occur instead of frame by frame, meaning they are faster at handling new information and allow for high temporal resolution at microsecond precision.
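The finger experiment above maps onto the classic stereo geometry relation: depth is inversely proportional to disparity. A minimal sketch (not the team's event-based pipeline; the function name and the example focal length and baseline values are illustrative assumptions):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Classic pinhole stereo relation: depth = f * B / d, where
    f is the focal length (pixels), B the distance between the two
    cameras (metres), and d the disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A nearby object produces a large disparity, hence a small depth,
# and vice versa (f = 800 px, B = 10 cm are made-up example values).
near = depth_from_disparity(disparity_px=50.0, focal_length_px=800.0, baseline_m=0.1)
far = depth_from_disparity(disparity_px=5.0, focal_length_px=800.0, baseline_m=0.1)
print(near, far)  # 1.6 16.0
```

The hard part in practice is not this formula but finding which pixel (or event) in the left view corresponds to which in the right; event-based cameras help because matching can be triggered per event rather than per full frame.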
As for the Spiking Vision Transformer team, their task was to learn how a bio-inspired transformer variant can be implemented for gesture classification on an event-based dataset. After taking time to understand the task, the team settled on two possible ways of approaching the problem statement: bottom-up and top-down. On the bottom-up side, they worked on simplifying the spiking ViT code given to them. Sadly, as is usual in research, the results did not yield happy scientists: simplifying the system decreased its accuracy, and this held for both approaches. In other words, simplifying the transformer made the system worse at its job. It must be noted, though, that the team reported issues with their hardware, which they believe is the most likely reason for the unsuccessful results. Nevertheless, without such an experiment it would have been difficult to predict the outcome, so it is good that the hackathon was used to find out!
A team that went far was the SPADES team. They were tasked with using BrainChip neuromorphic hardware to precisely estimate the future position of spacecraft based on past event-based data. After a selective voting process, in which both the mentors and student representatives voiced their opinions, the team of Roua, Jakub, Manoj and Jost won the challenge! They were the most successful at implementing their solution on the AKIDA chip and did wonderfully at predicting spacecraft pose from the SPADES dataset using event sensing.
Many opportunities arose during this event. For example, the team at Fortiss presented the groups with possible thesis projects and/or HiWi positions for the following semesters. Not only that, but the Fortiss venue allowed the participants to observe what the research groups work on in their daily lives. Fortiss's generosity did not end there: the neuromorphic research group provided the teams with snacks, coffee, tea and, of course, a wonderful lunch at the end of the event, while the neuroTUM team made sure everyone received a tote bag, a nice addition to anyone's commute on the U-Bahn.
The question to ask now is: will there be another neuromorphic hackathon? And if so, are you, dear reader, going to take part? To find out, feel free to follow us on Instagram or subscribe to our newsletter!