Equipment

Diploma Thesis Opportunity: Brain-Computer Interface Smartwatch Integration

The SIPPRE Research Group is pleased to announce a new diploma thesis position focusing on the integration of Brain-Computer Interfaces (BCI) with consumer-grade smartwatches.

Thesis Overview

The project aims to develop a real-time brain signal monitoring system that streams processed EEG data from BCI devices to consumer smartwatches. The system will provide immediate biometric feedback, such as frequency band power visualization (Alpha, Beta, Theta), through an intuitive wearable interface.

Hardware Platforms

  • BCI Devices: EmotiBit, OpenBCI Cyton/Ganglion, or Enophone

  • Smartwatches: Google Pixel Watch 2 or Samsung Galaxy Watch7 (provided)

System Design

A hybrid architecture will be explored, involving:

  • Signal acquisition via BCI hardware

  • Real-time processing (filtering, FFT, band power extraction) using the BrainFlow framework (a short band-power sketch follows below)

  • Data distribution through an Android application

  • Wear OS visualization on a smartwatch with custom UI for live feedback

An alternative, mobile-only implementation will also be considered to eliminate the PC dependency.
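
To make the processing step concrete, here is a minimal sketch of band-power extraction with BrainFlow's Python bindings (assuming a recent BrainFlow release). It runs on BrainFlow's synthetic board so no hardware is needed; for the Cyton you would pass BoardIds.CYTON_BOARD and set the serial port. The filter settings and window sizes are illustrative choices, not thesis requirements.

```python
# Minimal sketch: stream a few seconds of data and compute Theta/Alpha/Beta
# band power with BrainFlow (pip install brainflow). Uses the synthetic board;
# swap in BoardIds.CYTON_BOARD and params.serial_port for real OpenBCI hardware.
import time

from brainflow.board_shim import BoardIds, BoardShim, BrainFlowInputParams
from brainflow.data_filter import DataFilter, FilterTypes, WindowOperations

board_id = BoardIds.SYNTHETIC_BOARD
board = BoardShim(board_id, BrainFlowInputParams())

board.prepare_session()
board.start_stream()
time.sleep(5)                                  # collect ~5 s of samples
data = board.get_current_board_data(512)       # most recent 512 samples
board.stop_stream()
board.release_session()

fs = BoardShim.get_sampling_rate(board_id)
channel = BoardShim.get_eeg_channels(board_id)[0]

# Band-pass 1-45 Hz in place, then Welch PSD and band powers
DataFilter.perform_bandpass(data[channel], fs, 1.0, 45.0, 4,
                            FilterTypes.BUTTERWORTH.value, 0.0)
nfft = DataFilter.get_nearest_power_of_two(fs)
psd = DataFilter.get_psd_welch(data[channel], nfft, nfft // 2, fs,
                               WindowOperations.HANNING.value)
theta = DataFilter.get_band_power(psd, 4.0, 8.0)
alpha = DataFilter.get_band_power(psd, 8.0, 13.0)
beta = DataFilter.get_band_power(psd, 13.0, 30.0)
print(f"theta={theta:.2f}  alpha={alpha:.2f}  beta={beta:.2f}")
```

In the hybrid architecture, values like these would be pushed from the processing host to the Android application (for example over a socket or Bluetooth LE) and forwarded to the Wear OS watch face; in the mobile-only variant the same processing would run on the phone.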

Research Goals

  • Demonstrate feasibility of BCI-smartwatch integration

  • Evaluate user experience and interface design for biometric feedback

  • Benchmark system latency and reliability

  • Explore applications in meditation, sleep analysis, focus training, and cognitive load assessment

Expected Outcomes

  • A fully functional prototype system (open-source)

  • Benchmarks on performance and latency

  • User interface design guidelines for wearable biometric displays

  • Academic contributions on the use of consumer wearables in BCI research

Innovation Potential

This project bridges professional-grade BCI technology with everyday consumer wearables, paving the way for more accessible neurotechnology in wellness and cognitive training applications.

Diploma Thesis Opportunity: Multimodal Emotion and Gameplay Analysis with PlayStation 4

The SIPPRE Research Group invites applications for a Diploma Thesis in the exciting field of multimodal analysis of player experience in gaming environments.

The project aims to design, implement, and evaluate a comprehensive experimental platform that records and analyzes the physiological and emotional responses of volunteers while playing on the PlayStation 4. Two contrasting genres will be studied:

  • 🧩 Brain/cognitive games that require focus and strategic thinking.

  • Action games that emphasize speed, reflexes, and intense engagement.

By comparing these categories, the thesis will investigate how different gameplay conditions affect the brain, the body, and the emotions of players.

Multimodal Data Sources

The student will integrate and synchronize diverse signals, including:

  • 🎮 Gameplay Recording via Elgato Cam Link.

  • 🧠 EEG (OpenBCI) to monitor brain activity.

  • ❤️ Physiological Signals (EmotiBit) – heart rate, GSR, SpO₂.

  • 👀 Eye Tracking to measure focus and attention shifts.

  • 😊 Facial Emotion Recognition from a webcam.

  • 🎛️ Controller Interaction Logging using a custom Arduino or Raspberry Pi device to capture button presses in real time (a timestamped logging sketch follows this list).
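
As a rough idea of the controller logging component, a hypothetical Arduino wired to the pad's buttons could print one line per event over USB serial, and a small host-side script could timestamp each event for later synchronization with the other modalities. The port name and message format below are assumptions for the sketch.

```python
# Minimal sketch: timestamp button events sent by a hypothetical Arduino that
# prints one line per press over USB serial (pip install pyserial).
import csv
import time

import serial

PORT = "/dev/ttyACM0"      # hypothetical port (e.g. COM3 on Windows)

with serial.Serial(PORT, 115200, timeout=1) as link, \
        open("button_log.csv", "w", newline="") as log:
    writer = csv.writer(log)
    writer.writerow(["timestamp", "event"])
    while True:
        line = link.readline().decode(errors="ignore").strip()
        if line:                               # e.g. "X_PRESSED"
            writer.writerow([time.time(), line])
            log.flush()                        # keep the log crash-safe
```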

Student Role & Responsibilities

The candidate will:

  • Set up the full experimental environment and ensure precise synchronization of all devices.

  • Recruit and manage volunteers for controlled gaming experiments.

  • Implement data fusion across modalities (EEG, physiology, gaze, face, gameplay events), aligning the streams on a shared clock (see the sketch after this list).

  • Conduct emotion recognition and correlation analysis across different game types.
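
For the fusion step, one simple starting point is to record every modality against a common clock and align the streams on timestamps afterwards. The sketch below uses pandas merge_asof for nearest-neighbour alignment of two hypothetical recordings; the file names, column names, and 50 ms tolerance are assumptions.

```python
# Minimal sketch: align EEG and EmotiBit recordings on a shared Unix-time
# "timestamp" column using nearest-neighbour matching (pip install pandas).
import pandas as pd

eeg = pd.read_csv("eeg_openbci.csv")        # hypothetical: timestamp, ch1..ch8
emotibit = pd.read_csv("emotibit.csv")      # hypothetical: timestamp, hr, gsr, temp

for df in (eeg, emotibit):
    df["timestamp"] = pd.to_datetime(df["timestamp"], unit="s")
    df.sort_values("timestamp", inplace=True)

# For every EEG sample, attach the closest EmotiBit reading within 50 ms.
fused = pd.merge_asof(eeg, emotibit, on="timestamp",
                      direction="nearest",
                      tolerance=pd.Timedelta("50ms"))
fused.to_csv("fused_session.csv", index=False)
```

Gaze, facial-expression, and gameplay event streams can be merged the same way once each carries a timestamp from the same clock, or a known offset to it.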

Applications & Research Impact

This work directly contributes to our ongoing SIPPRE research on Dynamic Difficulty Adjustment (DDA), where game difficulty adapts to the player’s state. Insights from this thesis could enable:

  • 🎮 Adaptive Gaming Systems that adjust difficulty in real time based on stress, engagement, or fatigue.

  • 🧠 Cognitive Workload Monitoring for training or educational games.

  • eSports Analytics to study performance under pressure.

  • 🩺 Mental Health and Stress Research using games as experimental environments.

What the Student Will Gain

  • Experience with state-of-the-art multimodal recording and synchronization (EEG, EmotiBit, eye tracking, video capture).

  • Skills in signal processing, machine learning, and human–computer interaction.

  • Training in experimental design and data collection with human participants.

  • A project at the intersection of gaming, neuroscience, and AI, with potential for scientific publications.

This is a unique opportunity for a motivated student to combine gaming, biosignals, and adaptive AI to shape the future of interactive systems.

📩 Interested candidates should contact Assoc. Prof. Athanasios Koutras for further details.

Open Diploma Thesis Position: VR & Brain-Computer Interfaces with Meta Quest 3S

The SIPPRE Research Group is excited to announce an open position for a Diploma Thesis in the cutting-edge area of Virtual Reality (VR) and Brain-Computer Interfaces (BCI).

Using our newly acquired Meta Quest 3S headset, the selected student will develop an innovative VR application that interacts with brain (EEG) or muscle (EMG) signals. The specific direction (EEG or EMG) will be defined as the project evolves.

What we are looking for

  • No prior experience is required – motivation and willingness to learn are the most important assets.

  • Familiarity with Unity or game development is a plus, but not mandatory.

  • Strong commitment, creativity, and enthusiasm for research in VR, neuroscience, and human-computer interaction.

What you will gain

  • Hands-on experience with VR development and biosignal-based interfaces.

  • Training and support from the SIPPRE team in signal processing, machine learning, and interactive system design.

  • Opportunity to work with state-of-the-art equipment (Meta Quest 3S, EEG/EMG systems).

  • A chance to contribute to emerging research in BCI and VR applications, opening pathways for publications and future research.

If you are curious, motivated, and ready to dive into the world where technology meets the human mind and body, this thesis is for you!

📩 Interested students should contact Assoc. Prof. Athanasios Koutras for further details.

New Equipment Acquisition: Meta Quest 3S Headset for VR/AR Research

The SIPPRE Research Group has recently acquired the Meta Quest 3S headset, a state-of-the-art virtual and augmented reality device, to support ongoing and future research activities in immersive environments.

This new addition enables the development and evaluation of interactive experiments involving VR-based brain-computer interface (BCI) scenarios, augmented reality applications for signal visualization, and multi-sensory integration studies.

The device will also be used in the design of serious games and other interactive experiences developed in Unity, with a focus on biofeedback, emotional state monitoring, and cognitive load adaptation, bridging extended reality (XR) with real-time physiological signals such as EEG and EDA.

It will be available for use in student diploma theses and collaborative projects combining signal processing, human-centered computing, and immersive interaction.

Getting Involved

Students and researchers interested in utilizing the Meta Quest 3S for experimental or academic purposes are encouraged to contact the lab. Priority will be given to innovative proposals that integrate extended reality (XR) technologies with biosignal-driven interaction and adaptive system design.

Stay tuned for upcoming demos and project calls involving our new VR/AR and biofeedback setup!

PlayStation Joins the SIPPRE Lab – Gaming Meets Research

We’re excited to announce that the SIPPRE Lab has acquired a PlayStation console—transforming a popular gaming platform into a powerful research and teaching tool. Far beyond entertainment, the PlayStation will support interdisciplinary research, student projects, and diploma theses in fields such as signal processing, affective computing, and human–computer interaction.

Research & Educational Use

Our PlayStation setup will be used to explore:

  • BCI-controlled gameplay using EEG and EMG signals.

  • Emotion-adaptive games, where gameplay evolves based on real-time affective responses.

  • Multiplayer neural synchrony studies using EEG during cooperative or competitive gaming.

  • Psychoacoustic research and real-time analysis of sound environments in games.

  • Visual attention tracking using eye-tracking integrated with FPS/VR game mechanics.

  • Exergaming for rehabilitation and cognitive-motor training.

  • Neurophysiological immersion and flow assessment using HRV, GSR, and EEG.

  • Benchmarking biosignal hardware with structured, game-based protocols.

Getting Involved

  • Students: If you’re interested in conducting research using this setup, submit a 1–2 page proposal outlining your research question, methodology, and timeline.

  • Deadline: Proposals for the upcoming academic term are now being accepted. Priority will be given to innovative projects that explore human–computer interaction and emotional response in interactive environments.

For inquiries or to discuss potential projects, please contact Prof. Athanasios Koutras at koutras@uop.gr.

Let’s redefine gaming—as a gateway to next-generation research!

New Addition to the SIPPRE Lab: Tello RoboMaster TT Drone

We are excited to announce the latest addition to the SIPPRE Lab’s equipment: the Tello RoboMaster TT drone! This programmable drone, compatible with Python, opens up exciting possibilities for innovative research and student projects.
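
For a feel of how little code a first scripted flight takes, here is a minimal sketch using the community djitellopy package (an assumption for this example; the drone can also be driven directly over the Tello UDP SDK):

```python
# Minimal sketch: scripted flight with djitellopy (pip install djitellopy),
# assuming the computer is connected to the drone's Wi-Fi network.
from djitellopy import Tello

tello = Tello()
tello.connect()
print(f"Battery: {tello.get_battery()}%")

tello.takeoff()
tello.move_up(50)             # distances are given in centimetres
tello.rotate_clockwise(90)
tello.land()
```

From the same connection the onboard video stream can also be read frame by frame, which would be the natural entry point for the computer vision and tracking projects listed below.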

The RoboMaster TT will be a centerpiece in various projects, including:

  • Brain-Computer Interface (BCI) Applications: Controlling the drone using brain and muscle signals to explore advanced human-machine interaction techniques.
  • Computer Vision Programs: Implementing face detection, face recognition, and person tracking to enhance autonomous navigation capabilities.
  • Swarm Intelligence Studies: Investigating collaborative drone behaviors and multi-agent systems, which have applications in search and rescue or environmental monitoring.
  • Gesture-Based Control: Designing systems where gestures captured by cameras or motion sensors control the drone’s movements.
  • Augmented Reality Integration: Combining drone operation with augmented reality for immersive experiences in education or entertainment.

This versatile drone will also serve as a cornerstone for final diploma thesis projects, providing students with hands-on experience in signal processing, machine learning, robotics, and more.

Stay tuned for updates as we explore the limitless potential of this powerful tool in research and education!

Creality K1C 3D Printer Joins the SIPPRE Lab!

We are thrilled to announce the arrival of the Creality K1C 3D Printer in the SIPPRE Lab! This state-of-the-art device adds a new dimension to our research and teaching capabilities, bringing advanced 3D printing technology to our projects and coursework.

The Creality K1C opens up incredible opportunities for students and researchers alike:

  • In Diploma Thesis Projects: Students can now design and prototype custom components for innovative applications, from biomedical devices to creative engineering solutions.
  • In Semester Projects: Courses like Digital Image Processing and Sound and Music Processing can leverage the printer for hands-on learning experiences.

This new tool empowers us to turn concepts into tangible creations, bridging the gap between theory and practice. We’re looking forward to seeing how students and researchers use the K1C to push the boundaries of their creativity and problem-solving skills.

Let’s get printing!

New Addition to the SIPPRE Lab: Eno Enophones

We are excited to announce the acquisition of Eno Enophones, a unique combination of high-quality headphones and EEG sensors, now available in the SIPPRE Lab. These cutting-edge devices allow us to conduct simultaneous audio-based experiments and brain activity monitoring, opening new avenues for research and student projects.

The Enophones will be utilized in various projects focused on hearing experiments and studies, including:

  • Emotion Recognition During Music Listening: Investigating how different musical genres and compositions evoke emotional responses, analyzed through EEG data.
  • Auditory Attention Studies: Exploring how the brain processes and responds to auditory stimuli, providing insights into focus and distraction in complex auditory environments.
  • Cognitive Engagement with Soundscapes: Evaluating brain activity during immersive audio experiences to understand neural responses to designed auditory environments.
  • Speech Perception and Processing: Studying the neural mechanisms involved in understanding spoken language and its relationship to cognitive load and mental states.
  • Music Therapy Research: Examining the therapeutic effects of music on emotional well-being and cognitive performance using real-time brain monitoring.

These projects will provide students with hands-on experience in neuroscience, auditory signal processing, and machine learning, using this innovative technology to bridge sound and brain research.

Stay tuned for updates on how the Enophones are driving advancements in auditory research and student-driven innovation in our lab!

Enhanced EEG Capabilities in the SIPPRE Lab

We are excited to announce a significant enhancement to the EEG research capabilities in the SIPPRE Lab. Along with our existing OpenBCI Cyton and Ganglion EEG recorders, we have recently acquired the Gel-Free BCI Cap Kit, designed to deliver high-quality, reliable signals with enhanced comfort for participants.

With this new setup, we’re ready to dive deeper into exciting projects, such as:

  • Brain-Computer Interface (BCI) Applications: Using brain signals to control devices like drones, robotic arms, or even virtual environments.
  • Emotion Recognition: Exploring how brain activity reflects emotions and applying it to fields like stress management and mental health.
  • Cognitive Load and Focus Studies: Understanding how our brains handle complex tasks, with potential applications in gaming, education, and workplace productivity.
  • Music and Sound Perception: Investigating how our brains react to music and sounds, from emotional responses to cognitive engagement.
  • Motor Imagery Research: Improving how brain signals are captured during imagined movements, enhancing BCIs for gaming and rehabilitation.

This upgrade makes it easier for both students and researchers to conduct high-quality EEG experiments, whether for final diploma projects or groundbreaking studies in neuroscience, signal processing, or machine learning.

We’re looking forward to sharing the exciting discoveries that will come out of this enhanced setup—stay tuned!

New Addition to the SIPPRE Lab: EmotiBit Sensor

We are proud to announce the latest addition to the SIPPRE Lab’s suite of equipment: the EmotiBit sensor. This wearable device provides a powerful platform for real-time physiological data acquisition, enabling researchers and students to explore a wide range of applications in emotion and health monitoring.

The EmotiBit sensor will be used in several innovative projects, including:

  • Emotion Monitoring During Media Interaction: Capturing physiological responses such as heart rate, skin conductance, and temperature while participants engage with music, videos, or other media to study emotional dynamics.
  • Stress and Relaxation Studies: Investigating physiological markers of stress and relaxation, and designing interventions to improve well-being.
  • Human-Computer Interaction (HCI): Using real-time emotional data to enhance the adaptiveness and usability of systems by responding to the user’s emotional state.
  • Computer Gaming Research: Examining player engagement, stress, and emotional responses during gaming sessions to inform game design, improve user experiences, and even develop biofeedback-based gaming.
  • Educational Research: Assessing cognitive and emotional responses during learning activities to optimize teaching methods and materials.
  • Multi-Signal Integration Projects: Combining EmotiBit data with EEG signals for advanced studies in brain-body interactions, such as exploring how physiological changes correlate with neural activity.

The EmotiBit sensor will also serve as a valuable tool for final diploma thesis projects, enabling students to gain hands-on experience with wearable technologies, physiological signal processing, and machine learning.

Stay tuned for updates on the exciting research that will emerge from the integration of EmotiBit into our lab’s activities!
