⚙️ Personal, academic, and experimental projects and studies. Not curated, which means some are super old and super basic.
Amarelinhos
📅 Year: 2023
🛠 Technologies Used: - Swift - SwiftUI
🚀 About:
Amarelinhos is an iOS app developed as a client for Carris Metropolitana (the Lisbon District bus company). It is designed to help users explore the Lisbon district with a comprehensive guide to the metropolitan bus lines, providing routes, schedules, maps, and more.
💻 Development:
The app uses data from Carris Metropolitana to show routes, maps, and schedules of Lisbon metropolitan buses.
FavoritesManager
📅 Year: 2023
🛠 Technologies Used: - Swift - iCloud - NSUbiquitousKeyValueStore
🚀 About: FavoritesManager is a lightweight and easy-to-use Swift package for managing favorite items using iCloud in your project.
💻 Development: FavoritesManager uses NSUbiquitousKeyValueStore to store and synchronize data across devices, providing a simple and efficient way to manage a list of favorite items.
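The package itself is Swift, backed by NSUbiquitousKeyValueStore. As a language-neutral illustration of the core idea, here is a Python sketch that keeps the favorites list under a single key in a key-value store; the in-memory store class is a stand-in for the real iCloud-backed one, and all names are illustrative:

```python
class KeyValueStore:
    """In-memory stand-in for a synced key-value store
    (the real package uses NSUbiquitousKeyValueStore)."""
    def __init__(self):
        self._data = {}

    def get(self, key, default=None):
        return self._data.get(key, default)

    def set(self, key, value):
        self._data[key] = value


class FavoritesManager:
    KEY = "favorites"

    def __init__(self, store):
        self.store = store

    def add(self, item):
        favorites = self.store.get(self.KEY, [])
        if item not in favorites:
            favorites.append(item)
            self.store.set(self.KEY, favorites)

    def remove(self, item):
        favorites = self.store.get(self.KEY, [])
        if item in favorites:
            favorites.remove(item)
            self.store.set(self.KEY, favorites)

    def is_favorite(self, item):
        return item in self.store.get(self.KEY, [])


store = KeyValueStore()
manager = FavoritesManager(store)
manager.add("route-1523")
print(manager.is_favorite("route-1523"))  # True
```

Keeping everything under a single key keeps the sync logic trivial: whatever store replaces the stand-in only has to synchronize one value across devices.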
SpeedManager
📅 Year: 2022
🛠 Technologies Used: - Swift - Core Location
🚀 About: SpeedManagerModule is a simple speedometer class for iOS and watchOS. The class measures speed using an iPhone or Apple Watch, based on CLLocation.
💻 Development: I developed this module because I wanted a speedometer app that was simple, had no ads, and had the features I wanted. I wanted an Apple Watch speedometer with complications and an iOS app with widgets, but I couldn't find one that suited my needs. So, I decided to create my own app. I started by measuring speed using CLLocationManager.
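CLLocation reports speed in metres per second (and a negative value when the reading is invalid), so a basic speedometer is mostly unit conversion plus a validity check. A Python sketch of that logic (the module itself is Swift; function names are illustrative):

```python
def speed_kmh(meters_per_second):
    # CLLocation reports speed in metres per second;
    # a speedometer usually displays km/h or mph.
    return meters_per_second * 3.6

def speed_mph(meters_per_second):
    return meters_per_second * 2.23694

def display_speed(meters_per_second, unit="kmh"):
    # A negative CLLocation speed means the reading is invalid.
    if meters_per_second < 0:
        return None
    convert = speed_kmh if unit == "kmh" else speed_mph
    return round(convert(meters_per_second), 1)

print(display_speed(10))  # 36.0 (km/h)
```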
YouTube Metadata
📅 Year: 2020
🛠 Technologies Used:
- Swift
- Node.js
🚀 About:
YouTube Metadata is an npm package and a Swift wrapper that helps you get YouTube metadata from a URL without using the YouTube API.
💻 Development:
There is a Swift client and an npm module with the same purpose.
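One well-known way to get basic YouTube metadata (title, author, thumbnail) without the Data API is YouTube's public oEmbed endpoint; whether these packages use exactly this approach is an assumption on my part. A Python sketch of building such a request:

```python
from urllib.parse import urlencode

def oembed_url(video_url):
    """Build a request URL for YouTube's public oEmbed endpoint,
    which returns title, author, and thumbnail as JSON without
    requiring an API key."""
    query = urlencode({"url": video_url, "format": "json"})
    return f"https://www.youtube.com/oembed?{query}"

print(oembed_url("https://www.youtube.com/watch?v=dQw4w9WgXcQ"))
```

Fetching that URL and decoding the JSON response is all the client has to do.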
Apple Watch Gesture Recognition
Master's degree project
📅 Year: 2020-2021
🛠 Technologies Used: - CoreML - CreateML - CoreMotion - WatchConnectivity
🚀 About: This research studied the use of wearables in games, particularly the detection of gestures for use as an input method in games.
💻 Development: As part of my Professional Master's degree in Digital Games Development, the prototype work included modelling gesture detection with neural networks and documenting the development and experimentation process for the hardware and software, their architecture, and their integration. I had the opportunity to work with various Apple ecosystem APIs, including CoreML, CreateML, CoreMotion, and WatchConnectivity.
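A common pipeline for this kind of gesture detection is to split the CoreMotion accelerometer stream into windows and summarize each window into features before classification. The details below (per-axis mean and standard deviation, in Python rather than Swift/CreateML) are illustrative assumptions, not the project's actual feature set:

```python
import statistics

def extract_features(window):
    """Summarize a window of (x, y, z) accelerometer samples into
    per-axis mean and standard deviation -- a common hand-rolled
    feature vector to feed a gesture classifier."""
    features = []
    for axis in range(3):
        values = [sample[axis] for sample in window]
        features.append(statistics.mean(values))
        features.append(statistics.pstdev(values))
    return features

window = [(0.0, 0.1, 0.9), (0.2, 0.1, 1.1), (0.1, 0.1, 1.0)]
print(extract_features(window))
```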
Plano Inclinado REA
📅 Year: 2021
🛠 Technologies Used: - HTML5 - CSS3 - Vanilla JavaScript - Canvas API - MediaRecorder
🚀 About: Plano Inclinado REA is a very simple inclined plane web simulator hosted on GitHub Pages.
💻 Development: Plano Inclinado REA was developed for the REA (Recursos Educacionais Abertos – Open Educational Resources) course in my specialization in computing applied to education at the University of São Paulo (USP). The project uses HTML5, CSS3, and vanilla JavaScript with the Canvas API, along with some newer browser features such as MediaRecorder, an interface of the MediaStream Recording API that makes it easy to record media. The project focuses on licenses and distribution, and on how to work with popular OER repositories in Brazil.
🔗 Links: - Plano Inclinado REA GitHub Repository
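The physics behind an inclined-plane simulator is the classic equation a = g(sin θ − μ·cos θ). A Python sketch of that formula (the simulator itself is JavaScript; the clamping rule is a simplifying assumption):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def acceleration(angle_deg, friction_coefficient=0.0):
    """Acceleration of a block on an inclined plane:
    a = g * (sin(theta) - mu * cos(theta))."""
    theta = math.radians(angle_deg)
    a = G * (math.sin(theta) - friction_coefficient * math.cos(theta))
    # If friction dominates, the block simply does not slide.
    return max(a, 0.0)

print(acceleration(30))       # g * sin(30 deg), about 4.9 m/s^2
print(acceleration(10, 0.5))  # 0.0 -- friction holds the block
```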
NFCPlay
📅 Year: 2020
🛠 Technologies Used: - Native code (for both platforms) - Unity3D
🚀 About: NFCPlay is a project that presents the use of an NFC tag (Near Field Communication) as an input mechanism on VR headsets.
💻 Development: User input in mobile Virtual Reality (VR) games has been a problem in the gaming industry for many years, because the touch screen is unreachable inside a headset. Some solutions rely on gaze-and-dwell timers (looking at something for a set time in order to act on it) and have been used in several games. Some VR headset models had a magnetic button on the side, but that button caused interference in the magnetometer and the GPS module, essential modules for developing VR games. The framework was created in native code for both platforms and unified in a Unity3D wrapper. It exposes an event-based subscription triggered by any NFC tag, which simulates a touch on an object in the game.
🔗 Links: - NFCPlay Paper
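The event-based subscription described above can be sketched as a plain observer pattern; Python is used here purely for illustration (the real framework is native code wrapped in Unity3D), and all names are hypothetical:

```python
class NFCTapEvent:
    """Minimal sketch of the subscription idea: game objects
    subscribe, and any NFC tag read fires every subscriber,
    simulating a screen touch."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def on_tag_detected(self, tag_id):
        for callback in self._subscribers:
            callback(tag_id)


taps = []
event = NFCTapEvent()
event.subscribe(lambda tag: taps.append(tag))
event.on_tag_detected("tag-01")  # a tag read acts as a "touch"
print(taps)  # ['tag-01']
```

The key design point is that the game code never polls the NFC hardware; it only reacts to the event, just as it would react to a touch.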
JoyPen
📅 Year: 2020
🛠 Technologies Used: - Python - OpenCV
🚀 About: JoyPen is a small app built in Python with OpenCV that tracks the movements of a cardboard joystick used to control racing games.
💻 Development: JoyPen was developed as part of my master's project during the 2020 COVID pandemic. The project aimed to use games to stimulate creativity, using items already available at home. The project idea was a cardboard joystick to control racing games, and the cardboard scheme to cut and build is available to download along with a small app built in Python using OpenCV to track the movements.
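After OpenCV has thresholded a camera frame into a binary mask, the core of tracking a cardboard marker is computing the marker's centroid and mapping its position to a control input. A dependency-free Python sketch of that step (the thresholding itself, and how the app maps positions to steering, are left out; this is an illustrative assumption about the pipeline):

```python
def marker_centroid(mask):
    """Centroid of 'on' pixels in a binary mask -- the core step of
    tracking a coloured marker once a frame has been thresholded
    (the real app does the thresholding with OpenCV)."""
    xs, ys = [], []
    for y, row in enumerate(mask):
        for x, pixel in enumerate(row):
            if pixel:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
]
print(marker_centroid(mask))  # (1.5, 1.5)
```

The centroid's horizontal offset from the frame center can then be translated into a steering value for the racing game.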
Damas
📅 Year: 2020
🛠 Technologies Used: - Unity 3D - C#
🚀 About: Developed with Unity 3D and C#, Damas is a checkers game whose computer opponent uses no classical artificial-intelligence algorithms.
💻 Development: The app was developed with Unity 3D and C#. The challenge was balancing the computer opponent so the player can't tell whether they're facing an AI or a human, without using any classical artificial-intelligence algorithms. To solve this, I represented the board as a matrix and checked each piece's neighbors on every move.
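The neighbor-checking idea can be sketched as follows; the encoding (0 = empty, 1 = us, 2 = opponent) and the focus on capture detection are illustrative assumptions, in Python rather than the project's C#:

```python
def capture_moves(board, row, col):
    """Find captures for the piece at (row, col) by checking each
    diagonal neighbour: an opponent piece with an empty square
    directly behind it can be jumped."""
    size = len(board)
    moves = []
    for dr, dc in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
        mid_r, mid_c = row + dr, col + dc          # neighbour square
        end_r, end_c = row + 2 * dr, col + 2 * dc  # landing square
        if 0 <= end_r < size and 0 <= end_c < size:
            if board[mid_r][mid_c] == 2 and board[end_r][end_c] == 0:
                moves.append((end_r, end_c))
    return moves

board = [
    [0, 0, 0, 0],
    [0, 2, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]
print(capture_moves(board, 2, 2))  # [(0, 0)]
```

Scanning the matrix like this after every move gives the opponent enough information to pick plausible moves without any search or learning.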
Genetic Racing
📅 Year: 2020
🛠 Technologies Used: - Unity3D engine - C#
🚀 About: Developed with the Unity3D engine and C#, Genetic Racing uses a genetic algorithm (GA) to learn, by approximation, how to drive around the racetrack.
💻 Development: Genetic Racing was developed using the Unity3D engine and C# as part of Lab I class in my master's course.
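A genetic algorithm in this setting evolves a population of candidate "drivers": the fittest are selected, recombined, and mutated, generation after generation. A toy Python sketch with a stand-in fitness function (the project's real fitness, genome, and operators lived in Unity/C#; everything here is an illustrative assumption):

```python
import random

random.seed(42)

def fitness(genome):
    # Toy stand-in for "distance driven along the track":
    # the closer each gene is to 1.0, the better (max is 0).
    return -sum((g - 1.0) ** 2 for g in genome)

def crossover(a, b):
    # Single-point crossover: splice two parent genomes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.1):
    # Occasionally nudge a gene with Gaussian noise.
    return [g + random.gauss(0, 0.3) if random.random() < rate else g
            for g in genome]

def evolve(pop_size=30, genome_len=8, generations=60):
    population = [[random.uniform(-1, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children       # parents survive (elitism)
    return max(population, key=fitness)

best = evolve()
print(round(fitness(best), 3))
```

Keeping the parents in the next generation guarantees the best fitness never regresses, which makes the learning curve easy to watch in the simulation.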
CIAP 2
📅 Year: 2019
🚀 About: This app covers CIAP-2, the Brazilian version of ICPC-2 (International Classification of Primary Care), used in primary care by family doctors, nurses, psychologists, social workers, assistants, and others.
💻 Development: On iOS, I used Swift, with a hybrid UI approach mixing view code and Storyboards, a simple MVVM architecture, CollectionViews, CloudKit for favorites sync, and CocoaPods to manage dependencies. On Android, I used Kotlin, MVP, Fragments, and RecyclerViews. The app is authorized by the Grupo de Trabalho de Prontuário e Classificação of SBMFC to use the tables available in the official book.
WatchShaker
📅 Year: 2017
🛠 Technologies Used: - watchOS
🚀 About: Simple motion detector for ⌚️ (watchOS) shake gesture. Shake your Apple Watch! WatchShaker is a watchOS helper to get your ⌚️ shake movements.
💻 Development: My motivation: why not? Actually, the real motivation came during another project, when I noticed that every smartphone has an API to detect a shake gesture, while the Apple Watch has the same sensors but no such access. You can get the direction of the shake in the didShakeWith method; ShakeDirection is a simple enum that tells you whether the shake went up, down, left, or right. WatchShaker was featured in edition #118 of the This Week in Swift newsletter.
Watch Clicker Presentation
📅 Year: 2017
🛠 Technologies Used: - Watch Connectivity - Multipeer Connectivity - Simulated KeyPress
🚀 About: It's 2017, why not control your slides transition (next and previous), using your Apple Watch? Let's do it.
💻 Development: The Apple Watch app was developed as a companion to an iPhone app, and a menu bar app was built to start everything. Basically the path is: ⌚️ Watch Connectivity 📲 Multipeer Connectivity ➡ 💻 Simulate a KeyPress.
Master Exploder
📅 Year: 2015
🛠 Technologies Used: - Image segmentation - Convex hull construction with Jarvis's Algorithm - Area characteristics extraction
🚀 About: A game controlled by computer vision based on image segmentation and construction of a convex hull with Jarvis’s Algorithm in the identification of hand gestures (hand-tracking).
💻 Development: The proposed work is based on image segmentation, construction of a convex hull with Jarvis's Algorithm, and determination of the gesture pattern from area characteristics extracted from the convex hull. The name "Master Exploder" is a reference to a Tenacious D song from the album (and movie) The Pick of Destiny. The challenge was doing computer vision without OpenCV, implementing everything by hand to understand the principles behind it. We made a poster explaining the project; it's available in Portuguese here.
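Jarvis's march (gift wrapping) builds the hull by starting at the leftmost point and repeatedly picking the most counter-clockwise remaining point until it wraps back to the start. A minimal Python sketch of the algorithm (the project itself was implemented by hand in another environment):

```python
def cross(o, a, b):
    """Cross product sign: negative means b is clockwise of a
    around o."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def jarvis_hull(points):
    """Gift wrapping (Jarvis's march): O(n*h) for h hull points,
    which is why it suits small point sets like a hand silhouette."""
    if len(points) < 3:
        return list(points)
    hull = []
    start = min(points)  # leftmost (then lowest) point
    current = start
    while True:
        hull.append(current)
        candidate = points[0]
        for p in points[1:]:
            # Replace the candidate if p is more counter-clockwise.
            if candidate == current or cross(current, candidate, p) < 0:
                candidate = p
        current = candidate
        if current == start:
            break
    return hull

points = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]
print(jarvis_hull(points))  # [(0, 0), (2, 0), (2, 2), (0, 2)]
```

Interior points such as (1, 1) never appear in the hull, which is exactly what makes the hull's area characteristics usable for telling hand gestures apart.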
📅 Year: 2015
🛠 Technologies Used: - Objective-C - C++ - NodeJS
🚀 About: A smart pill box integrated with an iOS app, made during a hackathon at Campus Party 2015.
💻 Development: I was learning everything I used (Objective-C, C++, NodeJS) as I went. That is my spirit in hackathons: having fun and learning. And this time I won :) 🏅 The project was made during a hackathon at Campus Party, using an Internet of Things kit from Telefónica (the hackathon's sponsor and organizer). It was cool: great people, fun coding, a lot of learning, and the results made the press.
ℹ️ Post: about this project here
LogiKid
📅 Year: 2013
🛠 Technologies Used: - C programming language - Allegro 5 graphic library
🚀 About: LogiKid is an educational computer game whose objective is the teaching and learning of logic gates and Boolean logic.
💻 Development: First-year undergraduate project :). The project was developed in C, using the Allegro 5 graphics library. Great times. Oh boy. Thanks to GitHub for preserving this project!
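The Boolean logic the game teaches boils down to evaluating two-input gates over all input combinations. A small Python sketch of that idea (the game itself is C/Allegro; this only illustrates the concept being taught):

```python
# Two-input logic gates as simple predicates.
GATES = {
    "AND":  lambda a, b: a and b,
    "OR":   lambda a, b: a or b,
    "XOR":  lambda a, b: a != b,
    "NAND": lambda a, b: not (a and b),
}

def truth_table(gate):
    """Full truth table for a two-input gate -- the kind of
    exercise the game presents to the player."""
    return [(a, b, bool(GATES[gate](a, b)))
            for a in (False, True) for b in (False, True)]

for row in truth_table("XOR"):
    print(row)
```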