Sebastien De Greef

Professional Summary:

With over two decades of experience in software engineering, I have carved out a niche in artificial intelligence (AI), natural language processing (NLP), and advanced data analysis. My journey into AI began more than a decade ago, fueled by a fascination with early NLP technologies, back when the field relied heavily on rule-based systems such as part-of-speech (POS) taggers and bag-of-words models.

Over the years, I have honed my skills in sentiment analysis, named entity recognition (NER), and knowledge graph construction. My work has involved scraping vast datasets from the nascent Semantic Web, working with formats such as RDF and OWL, and transforming structured and unstructured data into actionable knowledge. This foundational work in data collection and analysis set the stage for my later explorations into more complex AI domains. My technical expertise extends to writing compilers, which deepened my understanding of language processing, tokenizers, and parsers, and of turning programming and natural languages into executable instructions and vice versa. This background has been instrumental in developing advanced tokenization techniques, context window configurations, and sequence-to-sequence (Seq2Seq) models, many of which I have built for personal projects and training.

Driven by an unyielding commitment to self-improvement and to mastering state-of-the-art (SOTA) techniques, I recently dedicated four months to intensive training focused exclusively on cutting-edge AI technologies. This strategic pause was a leap of faith: transitioning from a successful career in software development to pursue my passion for AI full-time.

In addition, I developed the AI Cookbook (https://huggingface.co/spaces/sebdg/ai-cookbook), a comprehensive guide that documents my journey and knowledge in AI. This resource covers theoretical foundations, practical applications, and curated educational materials, showcasing my ability to self-teach, document complex technical knowledge, and create valuable educational resources for the AI community. It includes my work with tools and libraries such as CrewAI, LangChain, TensorFlow, TensorBoard, Torch, TTS (text-to-speech), and STT (speech-to-text). I am also proficient in Python, Pandas, and Label Studio.

I have published a LinkedIn article on the concept of "good enough" AI (https://www.linkedin.com/pulse/what-good-enough-ai-s%2525C3%2525A9bastien-de-greef-euhfe/), focusing on achieving efficiency without compromising performance. I earned a LinkedIn Machine Learning Top Voice badge and am very active in online communities such as the Ollama Discord, where I am a recognized go-to contributor. I also give weekly hands-on live presentations on various aspects of AI.

My recent projects include developing an emotion classifier and fine-tuning Llama3 models for emotion and sentiment analysis, as well as creating a fine-tuning dataset for function calling on Llama and Phi3 models, published on https://huggingface.co/sebdg. These projects demonstrate my ability to adapt to and innovate with the latest AI technologies. I have also built an advanced object-detection demo (https://sebdg-portfolio.static.hf.space/object-detection.html) using YOLO (You Only Look Once) and SAM (Segment Anything Model), showcasing my expertise in implementing object detection algorithms, training models on diverse datasets, and achieving high accuracy in real-time detection tasks across a variety of practical scenarios.

Furthermore, I developed Ruby-Cruit (https://chatgpt.com/g/g-BUfysmfX4-ruby-cruit), a custom GPT tailored to present me as a candidate to companies. This AI tool enhances my professional presence by engaging potential employers and inviting them to converse with "Ruby," my AI-powered representative. Currently, I am working on a project to map supply chains through AI-driven information gathering, using a Retrieval-Augmented Generation (RAG) system with vector stores, scrapers, and D3.js visualizations. The project will be presented at Windesheim University of Applied Sciences, underscoring my ability to integrate multiple technologies to solve complex real-world problems.

As I seek opportunities to bring my extensive background and fresh training to innovative AI projects, I am eager to contribute to a team that values pioneering solutions, continuous learning, and real-world impact. My career is a testament to my belief that, with the right mix of data, technology, and creativity, the possibilities are limitless.