about me
Hi, I'm Julian.
This site serves as documentation for all of my different projects, as well as a personal website where I can talk about things I'm interested in.
You can use the navigation sidebar to visit my different posts and documentation.
languages
Here are the different programming languages I've learnt and used, ranked from strongest to weakest understanding.
softwares
Here is some of the software I've had experience using:
humanoid_robot
WIP
thesis
2025
WIP
micro_mouse
2024
Third year of uni meant doing the micromouse competition. This competition, inspired by the worldwide Micromouse events, asked us to develop a small autonomous robot capable of completing an obstacle course comprised of 16x16 mazes, a winding causeway and a spiral ramp. To do this, I used IR sensors, due to their ease of use and the colour difference between the maze walls and the floor.
The microcontroller required for this project was a Nucleo L432KC, a type of small STM32 board. Similar to the Face Robot, low-level C was required here.
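As a rough illustration of the approach (the names and threshold value here are hypothetical, not taken from the actual project code), IR wall detection on a board like this usually boils down to comparing an ADC reading against a calibrated cutoff:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical calibrated cutoff: readings above this mean the IR light
 * is reflecting strongly off a nearby wall rather than the darker floor.
 * 2048 is the midpoint of a 12-bit ADC; tuned per sensor in practice. */
#define WALL_THRESHOLD 2048u

/* Pure decision logic, kept separate from the STM32 ADC driver so it
 * can be tested off-target. */
static bool wall_detected(uint16_t adc_reading)
{
    return adc_reading > WALL_THRESHOLD;
}
```

Keeping the threshold comparison in its own function like this also makes recalibration for different lighting conditions a one-line change.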

Artistic Depiction of the Micro Mouse Robot
face_robot
2024
In my second year of uni, I worked on a small robotics project for one of my classes. The goal of this project was to develop a life-sized face robot that could realistically display a few emotions through preprogrammed animations.
As such, we were required to have eyes that moved at least from side-to-side, a mouth that could open and close as well as eyebrows. These elements allowed the robot to display the required emotional animations for the rubric.
Additionally, we were allowed to develop an X-factor, a custom element that would aid the realism of the face robot. Notably for this, we were allowed to use derivative designs. I chose to employ Will Cogley's EyeMech Design. Unlike the bare-minimum required design, this mechanism allowed for up-and-down movement as well, which meant that the eyes could move in any direction, just like a real human's. For the jaw, I used a single MG90s servo motor inside the bearing-suspended jaw to rotate it. Two more MG90s servo motors controlled the popsicle-stick eyebrows. Together, these elements gave the face surprisingly realistic movements.
To control the robot, there were two modes. The first mode allowed the user to trigger specific facial animations (happy/sad/surprised) through a set of buttons. The second mode cycled through the different emotions until the internal microphone heard a volume above a certain threshold, at which point the robot would display a surprised emotion for a few seconds before going back to cycling through the emotions.
The Mega 2560 R3 microcontroller was programmed in low-level Arduino, and no libraries were allowed for submission.
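The sound-triggered behaviour of the second mode can be sketched as a small state machine. This is a minimal, hypothetical reconstruction, not the project's actual code; the names, threshold, and timing values are illustrative:

```c
#include <assert.h>

/* Hypothetical tuning values */
#define MIC_THRESHOLD   600  /* raw analog level that counts as "loud" */
#define SURPRISE_TICKS  3    /* how many update ticks the surprise lasts */

enum emotion { HAPPY, SAD, SURPRISED, EMOTION_COUNT };

struct face_state {
    enum emotion current;
    int surprise_timer;  /* ticks remaining in the surprised pose */
};

/* One tick of mode 2: cycle through emotions, but jump to SURPRISED when
 * the microphone level crosses the threshold, holding it for a few ticks
 * before resuming the cycle. */
static void mode2_tick(struct face_state *s, int mic_level)
{
    if (mic_level > MIC_THRESHOLD) {
        s->current = SURPRISED;
        s->surprise_timer = SURPRISE_TICKS;
    } else if (s->surprise_timer > 0) {
        s->surprise_timer--;               /* keep showing surprise */
    } else {
        /* resume cycling: happy -> sad -> surprised -> happy -> ... */
        s->current = (enum emotion)((s->current + 1) % EMOTION_COUNT);
    }
}
```

On the real board, `mic_level` would come from an `analogRead`-style poll of the microphone pin each loop iteration; keeping the logic in a plain function like this makes it easy to check off-hardware.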

Artistic Depiction of the Face Robot
warman_challenge
2023
WIP
snake
Use arrow keys to move. Avoid the walls and your own tail!
friends
Here are some of my friends' websites: