The arrival of multimedia innovations such as games has changed how media is accessed and consumed. Oversteer is a project that takes advantage of gaming media to let users experience driving much as they would on a real racetrack. It was born from a passion for cars, engines, and games; combining these interests produced this project.

Faculty of Information Technology
The process of treating cancer patients in the chemotherapy department at Chonburi Cancer Hospital is complicated and inconvenient due to the procedure of submitting blood test results through the personal LINE application of medical staff, which hinders workflow efficiency. Therefore, the researcher has developed a cancer patient management and tracking program in the form of a web-based application and LINE LIFF (LINE Front-end Framework) application to facilitate both medical personnel and patients. The web-based application is designed for medical personnel to monitor, schedule, and collect patient data, while the LINE application is designed for patients to submit blood test results, view appointment schedules, record symptoms after chemotherapy, log their weekly weight, and access a chatbot for consultation. This system is developed based on client-server technology, which enhances data analysis efficiency and supports automated treatment planning. As a result, the cancer treatment process becomes faster, more modern, and more efficient.
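As a rough illustration of the client-server data flow described above, a blood test submitted through the LINE LIFF client might be recorded on the server side as follows. All class names, fields, and the clinical threshold here are hypothetical placeholders, not the actual system's schema or rules.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class BloodTest:
    """Hypothetical blood test record submitted via the LINE LIFF client."""
    patient_id: str
    taken_on: date
    wbc: float  # white blood cell count, x10^3 cells/uL (illustrative field)

@dataclass
class Patient:
    """Hypothetical server-side patient record."""
    patient_id: str
    name: str
    tests: list = field(default_factory=list)

    def submit_test(self, test: BloodTest) -> bool:
        """Store a submitted result and return whether the next
        chemotherapy session may proceed (placeholder rule only)."""
        self.tests.append(test)
        return test.wbc >= 3.0  # illustrative threshold, not clinical guidance

# Usage: a patient submits one result through the LIFF front end.
p = Patient("HN-0001", "Example Patient")
ok = p.submit_test(BloodTest("HN-0001", date(2024, 1, 15), wbc=4.2))
```

In a real deployment this logic would sit behind a web API consumed by both the staff-facing web application and the patient-facing LINE application.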

Faculty of Architecture, Art and Design
-

Faculty of Engineering
The Thai Sign Language Generation System aims to create a comprehensive 3D modeling and animation platform that translates Thai sentences into dynamic and accurate representations of Thai Sign Language (TSL) gestures. This project enhances communication for the Thai deaf community by leveraging a landmark-based approach using a Vector Quantized Variational Autoencoder (VQVAE) and a Large Language Model (LLM) for sign language generation. The system first trains a VQVAE encoder using landmark data extracted from sign videos, allowing it to learn compact latent representations of TSL gestures. These encoded representations are then used to generate additional landmark-based sign sequences, effectively expanding the training dataset using the BigSign ThaiPBS dataset. Once the dataset is augmented, an LLM is trained to output accurate landmark sequences from Thai text inputs, which are then used to animate a 3D model in Blender, ensuring fluid and natural TSL gestures. The project is implemented using Python, incorporating MediaPipe for landmark extraction, OpenCV for real-time image processing, and Blender’s Python API for 3D animation. By integrating AI, VQVAE-based encoding, and LLM-driven landmark generation, this system aspires to bridge the communication gap between written Thai text and expressive TSL gestures, providing the Thai deaf community with an interactive, real-time sign language animation platform.
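The quantization step at the heart of the VQ-VAE described above, mapping each encoded landmark vector to its nearest codebook entry, can be sketched as below. The codebook size and latent dimension are illustrative choices, not the project's actual configuration.

```python
import numpy as np

def quantize(latents: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Return, for each latent vector (a row of `latents`), the index of
    its nearest codebook entry under Euclidean distance -- the discrete
    bottleneck of a VQ-VAE."""
    # Pairwise squared distances, shape (num_latents, codebook_size).
    d = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 4))   # 8 illustrative codes of dimension 4
latents = codebook[[2, 5]] + 0.01    # latents lying near codes 2 and 5
indices = quantize(latents, codebook)
```

During training, the encoder's continuous outputs are snapped to these discrete codes, and the resulting code sequences are what the LLM later learns to emit from Thai text before the landmarks drive the Blender animation.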