Equipment

New Equipment Acquisition: Meta Quest 3S Headset for VR/AR Research

The SIPPRE Research Group has recently acquired the Meta Quest 3S headset, a state-of-the-art virtual and augmented reality device, to support ongoing and future research activities in immersive environments.

This new addition enables the development and evaluation of interactive experiments involving VR-based brain-computer interface (BCI) scenarios, augmented reality applications for signal visualization, and multi-sensory integration studies.

The device will also be used in the design of serious games and other interactive experiences developed in Unity, with a focus on biofeedback, emotional state monitoring, and cognitive load adaptation, bridging extended reality (XR) with real-time physiological signals such as EEG and EDA.
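
To give a concrete sense of how XR content can be bridged with live physiological signals, the sketch below (a minimal illustration, not the lab's actual pipeline) publishes a placeholder EDA value over Lab Streaming Layer with pylsl, so that a Unity or XR client subscribing to the stream, for instance through an LSL plugin, could adapt the experience in real time. The stream name, sampling rate, and values are assumptions for illustration only.

```python
import time
import random
from pylsl import StreamInfo, StreamOutlet

# Describe the stream; name, rate, and units are illustrative assumptions.
info = StreamInfo(name="SIPPRE_EDA", type="EDA", channel_count=1,
                  nominal_srate=15, channel_format="float32",
                  source_id="sippre-eda-demo")
outlet = StreamOutlet(info)

while True:
    # Placeholder value; a real sensor reading (e.g. in microsiemens) goes here.
    outlet.push_sample([random.uniform(0.1, 10.0)])
    time.sleep(1 / 15)
```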

It will be available for use in student diploma theses and collaborative projects combining signal processing, human-centered computing, and immersive interaction.

Getting Involved

Students and researchers interested in utilizing the Meta Quest 3S for experimental or academic purposes are encouraged to contact the lab. Priority will be given to innovative proposals that integrate extended reality (XR) technologies with biosignal-driven interaction and adaptive system design.

Stay tuned for upcoming demos and project calls involving our new VR/AR and biofeedback setup!

PlayStation Joins the SIPPRE Lab – Gaming Meets Research

We’re excited to announce that the SIPPRE Lab has acquired a PlayStation console—transforming a popular gaming platform into a powerful research and teaching tool. Far beyond entertainment, the PlayStation will support interdisciplinary research, student projects, and diploma theses in fields such as signal processing, affective computing, and human–computer interaction.

Research & Educational Use

Our PlayStation setup will be used to explore:

  • BCI-controlled gameplay using EEG and EMG signals.

  • Emotion-adaptive games, where gameplay evolves based on real-time affective responses.

  • Multiplayer neural synchrony studies using EEG during cooperative or competitive gaming (a computational sketch follows this list).

  • Psychoacoustic research and real-time analysis of sound environments in games.

  • Visual attention tracking using eye-tracking integrated with FPS/VR game mechanics.

  • Exergaming for rehabilitation and cognitive-motor training.

  • Neurophysiological immersion and flow assessment using HRV, GSR, and EEG.

  • Benchmarking biosignal hardware with structured, game-based protocols.
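
As an example of what the neural synchrony item above involves computationally, here is a small, hedged sketch that estimates the phase-locking value (PLV) between one EEG channel from each of two players. It runs on synthetic signals, and the alpha band and sampling rate are illustrative choices, not a fixed protocol.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, fs, low, high, order=4):
    """Zero-phase band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv(x, y, fs, band=(8.0, 12.0)):
    """Phase-locking value (0..1) between two equally long EEG traces."""
    xf = bandpass(x, fs, *band)
    yf = bandpass(y, fs, *band)
    phase_diff = np.angle(hilbert(xf)) - np.angle(hilbert(yf))
    return float(np.abs(np.mean(np.exp(1j * phase_diff))))

# Synthetic stand-ins for one channel from each player (10 Hz rhythm + noise).
fs = 250
t = np.arange(0, 10, 1 / fs)
player1 = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
player2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(f"alpha-band PLV: {plv(player1, player2, fs):.2f}")
```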

Getting Involved

  • Students: If you’re interested in conducting research using this setup, submit a 1–2 page proposal outlining your research question, methodology, and timeline.

  • Deadline: Proposals for the upcoming academic term are now being accepted. Priority will be given to innovative projects that explore human–computer interaction and emotional response in interactive environments.

For inquiries or to discuss potential projects, please contact Prof. Athanasios Koutras at koutras@uop.gr.

Let’s redefine gaming—as a gateway to next-generation research!

New Addition to the SIPPRE Lab: Tello RoboMaster TT Drone

We are excited to announce the latest addition to the SIPPRE Lab’s equipment: the Tello RoboMaster TT drone! This programmable drone, compatible with Python, opens up exciting possibilities for innovative research and student projects.
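
For a sense of how easily the drone can be scripted, here is a minimal flight sketch using the third-party djitellopy library (one common choice; the official Tello SDK over UDP is an alternative). Connect your computer to the drone's Wi-Fi before running; the distances and rotation are arbitrary examples.

```python
from djitellopy import Tello

tello = Tello()
tello.connect()                            # open the SDK connection
print(f"Battery: {tello.get_battery()}%")

tello.takeoff()
tello.move_up(50)                          # distances are in centimetres
tello.rotate_clockwise(90)
tello.land()
```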

The RoboMaster TT will be a centerpiece in various projects, including:

  • Brain-Computer Interface (BCI) Applications: Controlling the drone using brain and muscle signals to explore advanced human-machine interaction techniques.
  • Computer Vision Programs: Implementing face detection, face recognition, and person tracking to enhance autonomous navigation capabilities (see the sketch after this list).
  • Swarm Intelligence Studies: Investigating collaborative drone behaviors and multi-agent systems, which have applications in search and rescue or environmental monitoring.
  • Gesture-Based Control: Designing systems where gestures captured by cameras or motion sensors control the drone’s movements.
  • Augmented Reality Integration: Combining drone operation with augmented reality for immersive experiences in education or entertainment.
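
As promised in the computer vision item above, the following rough sketch reads frames from the drone's video stream via djitellopy and runs OpenCV's stock Haar-cascade face detector on them; both libraries are illustrative choices rather than a fixed toolchain.

```python
import cv2
from djitellopy import Tello

tello = Tello()
tello.connect()
tello.streamon()                           # start the video feed
frame_read = tello.get_frame_read()

# OpenCV's bundled frontal-face Haar cascade.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

while True:
    frame = frame_read.frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Tello face detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to stop
        break

tello.streamoff()
cv2.destroyAllWindows()
```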

This versatile drone will also serve as a cornerstone for final diploma thesis projects, providing students with hands-on experience in signal processing, machine learning, robotics, and more.

Stay tuned for updates as we explore the limitless potential of this powerful tool in research and education!

Creality K1C 3D Printer Joins the SIPPRE Lab!

We are thrilled to announce the arrival of the Creality K1C 3D Printer in the SIPPRE Lab! This state-of-the-art device adds a new dimension to our research and teaching capabilities, bringing advanced 3D printing technology to our projects and coursework.

The Creality K1C opens up incredible opportunities for students and researchers alike:

  • In Diploma Thesis Projects: Students can now design and prototype custom components for innovative applications, from biomedical devices to creative engineering solutions.
  • In Semester Projects: Courses like Digital Image Processing and Sound and Music Processing can leverage the printer for hands-on learning experiences.

This new tool empowers us to turn concepts into tangible creations, bridging the gap between theory and practice. We’re looking forward to seeing how students and researchers use the K1C to push the boundaries of their creativity and problem-solving skills.

Let’s get printing!

New Addition to the SIPPRE Lab: Eno Enophones

We are excited to announce the acquisition of Eno Enophones, a unique combination of high-quality headphones and EEG sensors, now available in the SIPPRE Lab. These cutting-edge devices allow us to conduct simultaneous audio-based experiments and brain activity monitoring, opening new avenues for research and student projects.

The Enophones will be utilized in various projects focused on hearing experiments and studies, including:

  • Emotion Recognition During Music Listening: Investigating how different musical genres and compositions evoke emotional responses, analyzed through EEG data.
  • Auditory Attention Studies: Exploring how the brain processes and responds to auditory stimuli, providing insights into focus and distraction in complex auditory environments.
  • Cognitive Engagement with Soundscapes: Evaluating brain activity during immersive audio experiences to understand neural responses to designed auditory environments.
  • Speech Perception and Processing: Studying the neural mechanisms involved in understanding spoken language and its relationship to cognitive load and mental states.
  • Music Therapy Research: Examining the therapeutic effects of music on emotional well-being and cognitive performance using real-time brain monitoring.

These projects will provide students with hands-on experience in neuroscience, auditory signal processing, and machine learning, using this innovative technology to bridge sound and brain research.
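
As a flavour of the auditory EEG analysis such projects involve, the sketch below (illustrative only, run on a synthetic signal) estimates alpha- and beta-band power from an EEG segment with Welch's method, a common first step before emotion or attention classification.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, band):
    """Integrate the Welch PSD of x over the given frequency band (Hz)."""
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.sum(psd[mask]) * (freqs[1] - freqs[0]))  # rectangular integration

# Synthetic 30 s "EEG" segment with a dominant 10 Hz (alpha) rhythm.
fs = 250
t = np.arange(0, 30, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

alpha = band_power(eeg, fs, (8, 12))
beta = band_power(eeg, fs, (13, 30))
print(f"alpha/beta power ratio: {alpha / beta:.2f}")
```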

Stay tuned for updates on how the Enophones are driving advancements in auditory research and student-driven innovation in our lab!

Enhanced EEG Capabilities in the SIPPRE Lab

We are excited to announce a significant enhancement to the EEG research capabilities in the SIPPRE Lab. Along with our existing OpenBCI Cyton and Ganglion EEG recorders, we have recently acquired the Gel-Free BCI Cap Kit, designed to deliver high-quality, reliable signals with enhanced comfort for participants.
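
For reference, acquiring a few seconds of data from the Cyton board can look roughly like the sketch below, which uses the BrainFlow library (one option that supports OpenBCI boards); the serial port name is an assumption and depends on the operating system.

```python
import time
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"        # illustrative; e.g. "COM3" on Windows

board = BoardShim(BoardIds.CYTON_BOARD, params)
board.prepare_session()
board.start_stream()
time.sleep(5)                              # record roughly 5 seconds
data = board.get_board_data()              # channels x samples NumPy array
board.stop_stream()
board.release_session()

eeg_channels = BoardShim.get_eeg_channels(BoardIds.CYTON_BOARD)
print(f"Collected {data.shape[1]} samples on {len(eeg_channels)} EEG channels")
```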

With this new setup, we’re ready to dive deeper into exciting projects, such as:

  • Brain-Computer Interface (BCI) Applications: Using brain signals to control devices like drones, robotic arms, or even virtual environments.
  • Emotion Recognition: Exploring how brain activity reflects emotions and applying it to fields like stress management and mental health.
  • Cognitive Load and Focus Studies: Understanding how our brains handle complex tasks, with potential applications in gaming, education, and workplace productivity.
  • Music and Sound Perception: Investigating how our brains react to music and sounds, from emotional responses to cognitive engagement.
  • Motor Imagery Research: Improving how brain signals are captured during imagined movements, enhancing BCIs for gaming and rehabilitation.

This upgrade makes it easier for both students and researchers to conduct high-quality EEG experiments, whether for final diploma projects or groundbreaking studies in neuroscience, signal processing, or machine learning.

We’re looking forward to sharing the exciting discoveries that will come out of this enhanced setup—stay tuned!

New Addition to the SIPPRE Lab: EmotiBit Sensor

We are proud to announce the latest addition to the SIPPRE Lab’s suite of equipment: the EmotiBit sensor. This wearable device provides a powerful platform for real-time physiological data acquisition, enabling researchers and students to explore a wide range of applications in emotion and health monitoring.

The EmotiBit sensor will be used in several innovative projects, including:

  • Emotion Monitoring During Media Interaction: Capturing physiological responses such as heart rate, skin conductance, and temperature while participants engage with music, videos, or other media to study emotional dynamics.
  • Stress and Relaxation Studies: Investigating physiological markers of stress and relaxation, and designing interventions to improve well-being.
  • Human-Computer Interaction (HCI): Using real-time emotional data to enhance the adaptiveness and usability of systems by responding to the user’s emotional state.
  • Computer Gaming Research: Examining player engagement, stress, and emotional responses during gaming sessions to inform game design, improve user experiences, and even develop biofeedback-based gaming.
  • Educational Research: Assessing cognitive and emotional responses during learning activities to optimize teaching methods and materials.
  • Multi-Signal Integration Projects: Combining EmotiBit data with EEG signals for advanced studies in brain-body interactions, such as exploring how physiological changes correlate with neural activity.

The EmotiBit sensor will also serve as a valuable tool for final diploma thesis projects, enabling students to gain hands-on experience with wearable technologies, physiological signal processing, and machine learning.
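
To give a flavour of the physiological signal processing involved, here is a small illustrative sketch (not EmotiBit-specific code) that separates an EDA trace into a slow tonic level and a faster phasic component with simple low-pass filtering; the sampling rate and cut-off are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def split_eda(eda, fs, cutoff_hz=0.05):
    """Split an EDA trace into tonic (slow baseline) and phasic (fast) parts."""
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    tonic = filtfilt(b, a, eda)
    phasic = eda - tonic
    return tonic, phasic

# Synthetic 60 s trace; a real recording from the sensor would replace this.
fs = 15                  # assumed EDA sampling rate, adjust to the device settings
t = np.arange(0, 60, 1 / fs)
eda = 2 + 0.5 * np.sin(2 * np.pi * 0.01 * t) + 0.05 * np.random.randn(t.size)

tonic, phasic = split_eda(eda, fs)
print(f"mean tonic level: {tonic.mean():.2f} uS, phasic std: {phasic.std():.3f} uS")
```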

Stay tuned for updates on the exciting research that will emerge from the integration of EmotiBit into our lab’s activities!
