In the first week of our project, we brainstormed the design for the chair, settling on a conventional office chair that we will augment to house our electronics and contain the drive train for our motorized wheels. From there, we began researching the specifics of electroencephalography (EEG) technology to better understand what device we should use and how to interface with it.
Having settled on our project, our next step is to identify an EEG device within our budget that would allow us to differentiate brain activity into four distinct outputs.
We met with a team that had conducted a similar EEG-based project to discuss the challenges they faced, so that we can use the device effectively and head off issues before they arise. Additionally, they lent us their 12-channel EEG device for the duration of our project.
Our goal for the next week is to begin testing the device and familiarize ourselves with assigning brain activity to different actions.
We obtained our brain-computer interface (the EEG device) and began experimenting with it to see how it works and how we can use it effectively. We tested it on multiple people to determine how much of an impact hair volume has on the strength of the EEG signal.
Having now gained an understanding of how to use an EEG, we plan to begin experimenting with Node-RED, the software we'll be using to interface with the device.
Due to some technical difficulties, we were unable to make much progress: we discovered that Node-RED and its underlying components would not run on any of the computers we currently have available. We spent the week researching alternatives and attempting to work around the incompatibilities, but were unable to find a reasonable solution. As a result, we will have to wait for Adam to return from his research program at UMass Amherst and use his computer, which is compatible with the required software.
Our plan for next week, until Adam returns, is to begin constructing the physical parts of the chair, such as the housing for the motors, batteries, and other electronics.
With digital work on hold, we began planning the physical part of our project: the chair. To house the electronics and make the chair driveable, we must give our office chair a custom support frame capable of both holding the required components and bearing the chair's weight. To that end, we'll be removing the chair's existing swivel base and replacing it with a wooden cube frame topped with a plywood surface to support wiring and secure the Raspberry Pi. We'll be using dimensional lumber and plywood as a proof of concept due to their simplicity in construction.
We started construction of the cube frame, purchasing the needed lumber and assembling the base of the cube. This first build was just a proof of concept; we made a more precise wooden frame afterward.
Over the coming week we plan to continue constructing the cube and, when Adam returns, begin working on the code for Node-RED.
Now that Adam has returned, we were able to install and set up Node-RED. We then successfully connected the EEG to Node-RED and began researching how to interpret the data received from it. After some brief research, we constructed a node pathway that allows different programmable actions to take place based on the data interpreted from the EEG.
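As a sketch of what that pathway does at its core, a Node-RED function node can fan a recognized command out to one of four outputs. The payload field names below are assumptions for illustration, not the exact schema of the nodes we used:

```javascript
// Node-RED "function" node body (sketch), configured with four outputs.
// Assumes the upstream node delivers a mental command in
// msg.payload.command with a confidence score in msg.payload.power.
const outputs = [null, null, null, null]; // forward, backward, left, right
const index = { forward: 0, backward: 1, left: 2, right: 3 }[msg.payload.command];
if (index !== undefined && msg.payload.power > 0.5) {
    outputs[index] = { payload: msg.payload.command };
}
return outputs;
```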
Next week, we hope to obtain a Raspberry Pi to begin constructing the electronic components of the chair and move the project into a more physical testing phase.
This week we were able to obtain the two DC motors and the battery that will allow the chair to move forward and backward. Furthermore, we obtained the chair itself and built the applicable portions of it, including the back and the cushion.
Looking ahead, we decided to divide and conquer: our focus will be on assembling the chair and its electrical components while getting the BCI up and running in our future meetings. This plan consists of acquiring a motor shield that will let us control the individual motor speeds and directions, further work on getting Node-RED up and running, improving the accuracy of the EEG's results by testing various conditions and circumstances for our desired commands, and further refining the design of the chair's base.
Next week, we hope to start assembling Nüromotus, or at least be in a better position to do so.
Continuing to build Nüromotus in week 8, we focused on finishing the foundational elements of the chair, including establishing the baseline wiring of the drive train and sanding and painting the body of the chair. Next, we started focusing on interfacing with the EMOTIV EEG headset. After some brief testing and reading of documentation, we abandoned the original plan to utilize Node-RED in favor of a plain-text codebase written in Node.js. We chose Node.js as the runtime environment due to our familiarity with it, its strong performance, and its versatility across a variety of use cases.
Next week, we aim to finish painting the chair, control turning an LED on and off with our minds using our BCI setup, and add the electrical components to the wheelchair.
This week we 3D printed a wheel-to-motor adapter and spray-painted our wheels a shiny silver. We also began writing code to interface with the EEG headset using Emotiv's API, Cortex. We discovered early on that the Raspberry Pi does not have reliable compatibility or performance with Emotiv's software products, so we shifted to using a separate laptop to connect to the headset and process EEG signals, with the laptop wirelessly sending the PWM motor controls to the Pi over a WiFi connection.
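As a sketch of that wireless link, the laptop can push each motor command to the Pi as a small UDP datagram. The address, port, and message format here are illustrative placeholders rather than our exact protocol:

```javascript
// Laptop-side sender (sketch): forward drive commands to the Pi over WiFi.
const dgram = require('dgram');

const socket = dgram.createSocket('udp4');
const PI_ADDR = '192.168.1.50'; // hypothetical address of the Pi on our network
const PI_PORT = 5005;           // hypothetical port

// command is one of 'forward' | 'backward' | 'left' | 'right' | 'neutral'
function sendDriveCommand(command) {
  const msg = Buffer.from(JSON.stringify({ cmd: command, ts: Date.now() }));
  socket.send(msg, PI_PORT, PI_ADDR);
}

sendDriveCommand('neutral');
```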
This week we focused on writing the code that connects to the Cortex API and begins processing the mental commands. So far, our code is able to connect to the API, authenticate itself, connect to the headset, and begin receiving data. We also began testing different methods of interfacing with the output pins on the Pi, determining how to control devices such as LEDs and motors.
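A minimal sketch of that connection flow is below, using the ws package. The credentials are placeholders, and requestAccess, headset discovery, and error handling are omitted for brevity; Emotiv's Cortex documentation covers the full handshake:

```javascript
// Cortex connection flow (sketch): authorize, open a session, subscribe.
const WebSocket = require('ws');
const ws = new WebSocket('wss://localhost:6868', { rejectUnauthorized: false });

let token = null;
const call = (id, method, params) =>
  ws.send(JSON.stringify({ jsonrpc: '2.0', id, method, params }));

ws.on('open', () => {
  // Step 1: authorize with our app credentials to get a Cortex token
  call(1, 'authorize', { clientId: 'OUR_ID', clientSecret: 'OUR_SECRET' });
});

ws.on('message', (raw) => {
  const msg = JSON.parse(raw);
  if (msg.id === 1) {
    token = msg.result.cortexToken;
    // Step 2: open a session with the connected headset
    call(2, 'createSession', { cortexToken: token, status: 'open' });
  } else if (msg.id === 2) {
    // Step 3: subscribe to the mental-command ("com") data stream
    call(3, 'subscribe', { cortexToken: token, session: msg.result.id, streams: ['com'] });
  } else if (msg.com) {
    console.log('mental command:', msg.com); // e.g. ["push", 0.62]
  }
});
```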
This week we focused on mounting our circuit components on the chair, along with the 3D-printed parts for the wheels, motors, microcontroller, and battery. We also added a footrest to the chair; it has some give to it, so we will reinforce it next week. Finally, we added some lights to the undercarriage to bring some style to the chair.
When we tested our motors with the wheels attached, we found that they would only start to spin above 50% power. We began troubleshooting potential sources of error, checking whether our motors were stripped, whether the wiring was faulty, and whether the battery was at fault; ultimately, we concluded that our motor shield could not supply anywhere near the amperage the motors required.
We also completed the BCI code, allowing us to interpret mental commands from the EEG headset. As of now, there are no actions associated with these commands, as we have yet to connect the BCI software with the motor controller software. Before we connect the two, we need to ensure both function properly on their own. As such, we also began writing the necessary scripts for the motor controller, including speed control, safety limits, and a standard protocol for wirelessly sending commands to the motor controller.
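For the safety limits, one mechanism worth sketching is a watchdog on the Pi that drives the motors to neutral if commands stop arriving. The port and message format match the illustrative sender above; the two motor functions are stubs:

```javascript
// Pi-side receiver (sketch) with a dead-man watchdog safety stop.
const dgram = require('dgram');

const server = dgram.createSocket('udp4');
const TIMEOUT_MS = 500; // stop if no command arrives for half a second
let watchdog = null;

function applyCommand(cmd) { /* stub: set motor speeds for this command */ }
function stopMotors() { /* stub: drive all motor outputs to neutral */ }

server.on('message', (buf) => {
  const { cmd } = JSON.parse(buf.toString());
  applyCommand(cmd);
  clearTimeout(watchdog);
  watchdog = setTimeout(stopMotors, TIMEOUT_MS);
});

server.bind(5005);
```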
We were able to use our EEG setup to turn an LED on and off. We still need to train the AI that interprets the brain waves to control the light more accurately, but this is great progress!
This week we tested our wheel-motor adapter by sitting on the chair and found that prolonged sitting caused the adapter's structural integrity to fail completely. We designed a new adapter with the infill increased to 100%; the new design withstands substantial weight and continuous use of the chair. Additionally, we reinforced the connection between the footrest and the rest of the chair with metal brackets to make it fully stable.
Next week we aim to research and acquire new motor drivers for the wheelchair.
Our school had a FIRST Robotics team 8 years ago, but the program was discontinued due to budget cuts. We got access to the team's old robot and stripped it for its motors and its Talon SR motor controllers. The robot had two motors paired together through a speed-reduction gearbox on each side.
Pablo and Adam worked on modifying the body of the chair to fit the new motors and gearboxes from the FIRST Robotics robot. Side mounts were cut and screwed into the body of the wheelchair to add extra fastening points for the motor assembly. Our motor-wheel adapter didn't fit the gearbox, so we made a new one, and Pablo designed a 3D print to hold the motors' gearbox to the side of the wooden frame. The footrest was shifted forward to leave room for the motors, and our logo was added to it. Additional chunks of wood were screwed into the undercarriage to give the footrest more support.
We continued to refine our API and code as well.
The Talon SR motor controllers stripped from the FIRST Robotics robot were added to the wheelchair. These motor drivers can operate at 60 A continuously but have only one output channel each; this is a major upgrade over our previous motor controller, the L298N driver, which had dual 2 A outputs. We ended up using four Talon SR motor drivers. Adam soldered together the Raspberry Pi, motor drivers, battery, and motors, and added power switches on the side of the robot to control both batteries. Terminal connectors were used so that different components can easily be separated from each other.
Griffin worked on modifying the code for the Raspberry Pi's PWM pins to work with the new motor drivers, which operate on pulse widths between 1.0 and 2.0 ms (each extreme being full speed in one direction) with a neutral pulse of 1.5 ms. This timing scheme differs from conventional PWM control, and combined with the addition of two more motors, it required a full refactoring of the motor controller's code. Thankfully, the code's highly modular design made swapping the old code for the new fairly painless.
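On the Pi, these 1.0-2.0 ms pulses can be generated with the pigpio npm package's servo-style output. This is a sketch under that assumption; the GPIO pin number is illustrative, and the mapping shows the general idea rather than our exact code:

```javascript
// Talon SR pulse control (sketch) using the pigpio npm package.
const Gpio = require('pigpio').Gpio;

const leftMotor = new Gpio(18, { mode: Gpio.OUTPUT }); // illustrative pin

// Map a speed in [-1, 1] to a 1.0-2.0 ms pulse, with 1.5 ms as neutral.
function setSpeed(motor, speed) {
  const clamped = Math.max(-1, Math.min(1, speed));
  const pulseUs = Math.round(1500 + clamped * 500); // pulse width in microseconds
  motor.servoWrite(pulseUs);
}

setSpeed(leftMotor, 0);   // neutral: 1500 us
setSpeed(leftMotor, 0.5); // half speed forward: 1750 us
```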
We finally tested the new, complete motor setup. We didn't get a response from the motors at first when simulating a BCI forwards command and had to troubleshoot the problem; errors in the code and wiring were addressed to get the motors operational. We also found that using the same battery pack to power the Raspberry Pi and the lights starved the Raspberry Pi of voltage, causing the motors to move sporadically during neutral signals. We got a second battery pack dedicated to the lights and made a 3D-printed mount to attach it to the body of the chair. In the end, we had full control of all four motors by having WASD keyboard inputs simulate EEG data for neutral, forwards, turning, and backwards.
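The keyboard simulation itself is simple; a sketch is below, reusing the illustrative sendDriveCommand sender from earlier. Key releases and key repeat are glossed over here:

```javascript
// WASD simulator (sketch): map keypresses to the same commands the BCI emits.
const readline = require('readline');

readline.emitKeypressEvents(process.stdin);
process.stdin.setRawMode(true);

const keyToCommand = { w: 'forward', s: 'backward', a: 'left', d: 'right' };

process.stdin.on('keypress', (str, key) => {
  if (key.ctrl && key.name === 'c') process.exit(); // allow Ctrl+C to quit
  sendDriveCommand(keyToCommand[key.name] || 'neutral');
});
```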
The EEG headset we are using produces some random noise even with optimal training, so we added noise reduction to our code. This adds some latency but makes for a much smoother ride.
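One simple filter along these lines, sketched below, only passes a command through once several consecutive samples agree; the window size is illustrative, and widening it trades latency for smoothness:

```javascript
// Consensus filter (sketch): suppress one-off noisy commands.
const WINDOW = 5; // number of consecutive samples that must agree
const recent = [];

function filterCommand(cmd) {
  recent.push(cmd);
  if (recent.length > WINDOW) recent.shift();
  // Act on the command only if the whole window agrees; otherwise coast.
  const unanimous = recent.length === WINDOW && recent.every((c) => c === cmd);
  return unanimous ? cmd : 'neutral';
}
```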
We started driving the robot around using simulated EEG data from the keyboard. The motors for the right wheel were outputting more force for the same PWM signal, so we modified the code to compensate for the speed difference between the two sides. We also modified the motor controller to operate at fixed speeds instead of variable ones: a drive command now prompts the wheelchair to smoothly accelerate or decelerate to a predefined speed depending on the command issued. We can now drive the wheelchair around precisely with the laptop keyboard.
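The per-side trim and the ramping are both small pieces of code; the sketch below shows the idea for straight-line driving, with an illustrative trim factor and ramp step rather than our tuned values (setSpeed is the pulse helper sketched earlier):

```javascript
// Speed trim and ramping (sketch) for straight-line drive commands.
const RIGHT_TRIM = 0.9;  // right motors ran strong, so scale them down
const RAMP_STEP = 0.05;  // maximum speed change per control tick

let current = 0; // current speed in [-1, 1]

// Called on every control tick; nudges speed toward the fixed target.
function rampToward(target) {
  const delta = Math.min(RAMP_STEP, Math.abs(target - current));
  current += Math.sign(target - current) * delta;
  setSpeed(leftMotor, current);
  setSpeed(rightMotor, current * RIGHT_TRIM);
}
```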
Additionally, we found that our battery life was quite low, due partly to the battery's age and partly to its low ampere-hour rating of 7.8 Ah. We calculated the runtime to be only around 10 minutes, so we started researching batteries with higher ampere-hour ratings and settled on the Continental CB12260-NB, which should give us around an hour of runtime. After adding the larger battery, we have fully finished our wheelchair and code and only need to train our headset.
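The runtime estimate is just capacity divided by average current draw; as a rough check, the draw implied by our measured 10 minutes on 7.8 Ah (back-inferred, not measured directly) is:

```latex
t \approx \frac{C}{I_{\text{avg}}}
\quad\Longrightarrow\quad
I_{\text{avg}} \approx \frac{7.8\ \text{Ah}}{10/60\ \text{h}} \approx 47\ \text{A}
```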
With the physical portion of our project fully completed and all the coding finished, the only thing left to do was train our EEG. Our program allows multiple EEG profiles, but Griffin is our primary training subject. We doused his head in a saline solution to improve the electrodes' connection to his scalp and started training different commands, including standby, forwards, backwards, turn right, and turn left. Training the EEG to operate the wheelchair is like learning to use a third arm that just sprouted out of your body. We are starting to get very consistent neutral and forward states but are still training backwards, turn right, and turn left.
To train the BCI, Griffin puts all of his focus into thinking about a specific action, such as moving forwards or backwards. The AI analyzes the readings from each of the headset's 12 electrodes and automatically associates those distinctive output patterns with the direction being trained. While our chair is intended for someone with some level of paralysis or reduced motor function, to reduce the amount of training data we need to record, we began imagining hand signals in our thoughts, such as pointing forwards to move forwards or pointing to the left to turn left.
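Cortex exposes this training loop through its API; the sketch below starts one training round, reusing the call helper, token, and session from the connection sketch and mapping one of Cortex's built-in actions to our forwards command:

```javascript
// Start one mental-command training round via Cortex (sketch).
call(10, 'training', {
  cortexToken: token,
  session: sessionId,          // session id from createSession
  detection: 'mentalCommand',
  action: 'push',              // built-in action we map to "forwards"
  status: 'start',
});
```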