RESEARCH

Block-Based Brain Computer Interface Software

In this project, we are exploring a block-based programming environment capable of analyzing real-time electroencephalography (EEG) data, enabling users to quickly develop neurofeedback applications. This work is motivated by insights from the Brain-Computer Interface (BCI) and visual languages literature.

Facilitating Human-Robot Interaction Evaluation with Neurophysiological Visualizations

This project explores links between users’ cognitive state and spatial information from robots. Cognitive state data is interpreted from EEG signals while users communicate commands to robots during navigation tasks. This work presents a step towards visualizing position-based measurements of users’ cognitive state during interactions between humans and robots.
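The position-based aggregation behind such a visualization can be sketched in plain JavaScript. This is a hypothetical illustration, not the project's actual code; the grid size and function names (`addSample`, `cellAverages`) are assumptions made for the example.

```javascript
// Hypothetical sketch: accumulate EEG-derived cognitive-state readings into a
// spatial grid keyed by robot position, producing per-cell averages that
// could drive a heat map visualization. Names and sizes are illustrative.
const GRID_SIZE = 4; // 4x4 grid of workspace cells (assumption)

// sums[i][j] and counts[i][j] track the readings falling in each cell
const sums = Array.from({ length: GRID_SIZE }, () => new Array(GRID_SIZE).fill(0));
const counts = Array.from({ length: GRID_SIZE }, () => new Array(GRID_SIZE).fill(0));

// Record one cognitive-state reading (e.g. an engagement index in [0, 1])
// at a robot position (x, y) normalized to [0, 1).
function addSample(x, y, state) {
  const i = Math.min(GRID_SIZE - 1, Math.floor(x * GRID_SIZE));
  const j = Math.min(GRID_SIZE - 1, Math.floor(y * GRID_SIZE));
  sums[i][j] += state;
  counts[i][j] += 1;
}

// Per-cell mean cognitive state; null where the robot was never observed.
function cellAverages() {
  return sums.map((row, i) =>
    row.map((s, j) => (counts[i][j] > 0 ? s / counts[i][j] : null))
  );
}

addSample(0.1, 0.1, 0.8);
addSample(0.12, 0.05, 0.6);
const heat = cellAverages();
```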

WebBCI

Software environments used to create current neurotechnologies are often designed for developers experienced with languages such as Matlab, C++, Java, and Python. This research investigates the feasibility of JavaScript as a development platform for non-critical BCI systems. WebBCI is a JavaScript library that provides the basic tools necessary to run a BCI system entirely within a web browser. WebBCI builds upon existing JavaScript mathematical libraries such as Math.js and Numeric JavaScript, adding BCI-specific paradigms such as common spatial pattern (CSP), machine learning tools such as linear discriminant analysis (LDA), and signal processing methods such as power spectral density (PSD) and band power extraction.
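To make the signal-processing side concrete, the sketch below computes band power in plain JavaScript, the kind of primitive WebBCI wraps. This is an illustration only, not WebBCI's actual API; the function name and normalization are assumptions.

```javascript
// Illustrative sketch (not WebBCI's API): band power from a direct DFT.
// Power per frequency bin is summed within a band (e.g. alpha, 8-12 Hz).
// A direct DFT is fine for short EEG windows like this one.
function bandPower(samples, sampleRate, [lowHz, highHz]) {
  const n = samples.length;
  let power = 0;
  // Frequency resolution is sampleRate / n; bin k sits at k * sampleRate / n Hz
  for (let k = 0; k <= Math.floor(n / 2); k++) {
    const freq = (k * sampleRate) / n;
    if (freq < lowHz || freq > highHz) continue;
    let re = 0, im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (-2 * Math.PI * k * t) / n;
      re += samples[t] * Math.cos(angle);
      im += samples[t] * Math.sin(angle);
    }
    power += (re * re + im * im) / (n * n); // normalized bin power
  }
  return power;
}

// Synthetic 10 Hz sine sampled at 128 Hz: alpha power should dominate.
const fs = 128;
const signal = Array.from({ length: fs }, (_, t) => Math.sin((2 * Math.PI * 10 * t) / fs));
const alpha = bandPower(signal, fs, [8, 12]);
const gamma = bandPower(signal, fs, [30, 50]);
```

In a browser, the same computation runs without any native dependencies, which is the feasibility argument WebBCI makes for JavaScript.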

Artistic BCI

Science, technology, engineering, and mathematics (STEM) is rapidly transitioning to STEAM, which integrates art and design with science. This approach promotes creativity in traditional engineering fields. Our work explores approaches to creating artistic BCI applications. We recently developed NeuroBrush, a multimodal, web-based application that allows users to paint abstract art competitively. The EEG-based painting application features a brush that dynamically changes in size and color in response to a user’s cognitive state. The goal of this research is to expand on prior artistic BCI work by integrating gamification, art, and multiparty, multimodal BCI. This project aims to explore how combining these approaches could influence BCI-based self-regulation training.
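A NeuroBrush-style state-to-brush mapping can be sketched as below. This is a hypothetical illustration, not the application's actual code; the size range and hue mapping are assumptions.

```javascript
// Hypothetical sketch of a cognitive-state-to-brush mapping: a score in
// [0, 1] (e.g. an engagement index) drives brush size and color.
// Higher engagement -> larger brush and a warmer hue. Ranges are assumptions.
function brushFromState(state) {
  const s = Math.min(1, Math.max(0, state)); // clamp to [0, 1]
  return {
    size: 2 + 28 * s,               // brush radius in px, 2 (calm) to 30 (focused)
    hue: Math.round(240 - 240 * s), // 240 (blue, calm) down to 0 (red, focused)
  };
}

const calm = brushFromState(0.1);
const focused = brushFromState(0.9);
```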

Brain-Drone Race

The Brain-Drone Race is a competition showcasing users' cognitive ability and mental endurance. During this event, competitors are required to out-focus opponents in a drone drag race driven by electrical signals emitted from the brain. On April 16, 2016, 16 participants competed using Emotiv Insight headsets and DJI Phantom 2 drones. The event was featured in over 800 news outlets, including Discovery, USA Today, and The New York Times. In this project, we are exploring methods that leverage real-time communication frameworks to design interactive EEG-based multiparty experiences. This work also seeks to expand awareness of emerging EEG technologies by inspiring future designers and engineers of neurotechnologies.
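The core race loop can be sketched as follows. This is an illustration, not the event's actual code; in the real system focus scores would arrive over a real-time channel (e.g. WebSockets), whereas here they are simulated constants, and the track length and speed are assumptions.

```javascript
// Illustrative sketch of a multiparty brain-drone race loop: each racer's
// latest EEG-derived focus score in [0, 1] is mapped to forward velocity;
// the first drone to cover the track distance wins. Constants are assumptions.
const TRACK_LENGTH = 10; // meters (assumption)
const MAX_SPEED = 2;     // m/s at full focus (assumption)

// Advance each drone by its focus-scaled velocity over a dt-second tick.
function stepRace(positions, focusScores, dt) {
  return positions.map((p, i) => p + MAX_SPEED * focusScores[i] * dt);
}

// Index of the first racer past the finish line, or null if none yet.
function winner(positions) {
  const i = positions.findIndex((p) => p >= TRACK_LENGTH);
  return i === -1 ? null : i;
}

// Simulate two racers with constant focus for ten one-second ticks.
let positions = [0, 0];
for (let tick = 0; tick < 10; tick++) {
  positions = stepRace(positions, [0.9, 0.4], 1);
}
```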

Team

Chris Crawford

Director

Nick

PhD Student

Amanda

PhD Student

Bryan

PhD Student

Jenna

MS Student

Pierce

Undergraduate Student

Ajay

Undergraduate Student

Will

Undergraduate Student

Jay

Undergraduate Student

Shomari

Undergraduate Student

Publications

Stegman, P., Crawford, C.S., and Gray, J., (2018). WebBCI: An Electroencephalography Toolkit Built on Modern Web Technologies. HCI International 2018, July 15-20, 2018, Las Vegas, NV, USA. Status: Accepted

Cioli, N., Holloman, A., and Crawford, C., (2018). NeuroBrush: An Artistic Multi-Modal, Interactive Painting Competition. CHI '18 Artistic BCI Workshop, April 22, 2018, Montreal, QC, Canada. Status: Accepted

Crawford, C.S., Andujar, M., and Gilbert, J.E., (2018). Brain-Computer Interface for Novice Programmers. ACM SIGCSE Technical Symposium on Computer Science Education, February 21-24, 2018, Baltimore, MD, USA, pp. 32-37.

Crawford, C.S., Andujar, M., and Gilbert, J.E., (2017). Neurophysiological Heat Maps for Human-Robot Interaction Evaluation. In Proceedings of 2017 AAAI Fall Symposium Series: Artificial Intelligence for Human-Robot Interaction AAAI Technical Report FS-17-01, November 9-11, 2017, Arlington, VA, USA, pp. 90-93.

Lieblein, R., Hunter, C., Garcia, S., Andujar, M., Crawford, C. S., & Gilbert, J. E. (2017). NeuroSnap: Expressing the User’s Affective State with Facial Filters. In International Conference on Augmented Cognition (pp. 345-353). Springer, Cham.

Crawford, C.S., Andujar, M., Jackson, F., Applyrs, I., & Gilbert, J.E. (2016). Using a Visual Programming Language to Interact with Visualizations of Electroencephalography Signals. In Proceedings of the 2016 American Society for Engineering Education Southeastern Section (ASEE SE), Tuscaloosa, AL, March 13-15, 2016.

Andujar, M., Crawford, C. S., Nijholt, A., Jackson, F., & Gilbert, J. E. (2015). Artistic brain-computer interfaces: the expression and stimulation of the user’s affective state. Brain-Computer Interfaces, 2(2-3), pp. 60–69.

Crawford, C.S., Badea, C., Bailey, S.W., & Gilbert, J.E. (2015). Using Cr-Y Components to Detect Tongue Protrusion Gestures. In Proceedings of the 33rd Annual ACM CHI 2015 Conference Extended Abstracts, pp. 1331-1336, Seoul, Republic of Korea, April 18-23, 2015.

Crawford, C.S. & Gilbert, J.E. (2015). Towards Analyzing Cooperative Brain-Robot Interfaces Through Affective and Subjective Data. In Proceedings of the 10th Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts pp. 231-232. 2015.

Crawford, C.S., Andujar, M., Jackson, F., Remy, S., & Gilbert, J.E. (2015). User Experience Evaluation Towards Cooperative Brain-Robot Interaction. In Proceedings 17th International Conference Human-Computer Interaction: Design and Evaluation, HCI International 2015, pp. 184–193, Los Angeles, CA, August 2-7, 2015, M. Kurosu (Ed.): Human-Computer Interaction, Part I, Springer LNCS 9169, DOI: 10.1007/978-3-319-20901-2_17.

Crawford, C.S., Andujar, M., Remy, S., & Gilbert, J.E. (2014). Cloud Infrastructure for Mind-Machine Interface. In Proceedings of the International Conference on Artificial Intelligence (ICAI), pp. 127-133.