This outing is part of a larger music technology research project, whose objective is to enhance music using hardware and software. This is the documentation for the whimsical first part of that project: an Android app that detects a wearer's footfalls by running live inference on an LSTM. The system streams data from an Mbient Labs IMU to a mobile app over Bluetooth. After you move the recorded .csv file to a computer with a GPU, you can use the Python code to train an LSTM on that data. You then export the trained LSTM to the Android app and can begin detecting footfalls. Feel free to download and experiment with the code. It's meant to be read and improved upon by you and your LLM codewriter of choice! https://github.com/willbjames/iolawalker
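To make the training stage concrete, here is a minimal sketch of what an LSTM footfall classifier over windowed IMU data can look like in PyTorch. The window length, channel count, model sizes, and the synthetic stand-in data are all assumptions for illustration, not the repo's actual values; see the repository's Python code for the real pipeline.

```python
# Hypothetical sketch of the LSTM training step described above.
# Assumes windows of IMU samples (accel + gyro = 6 channels) labeled
# footfall / no-footfall; shapes and hyperparameters are placeholders.
import torch
import torch.nn as nn

class FootfallLSTM(nn.Module):
    def __init__(self, n_features=6, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # one logit: footfall or not

    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # classify from the final timestep

def train_step(model, opt, x, y):
    loss = nn.functional.binary_cross_entropy_with_logits(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Synthetic stand-in for windowed .csv data: 32 windows of 50 samples.
x = torch.randn(32, 50, 6)
y = torch.randint(0, 2, (32, 1)).float()
model = FootfallLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):
    loss = train_step(model, opt, x, y)
print(f"final loss: {loss:.3f}")
```

A model like this could then be serialized for the phone (for example via TorchScript, `torch.jit.script(model).save("footfall.pt")`, loaded with PyTorch Mobile); whether the repo uses that route or another export format is something to check in the actual code.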