Transformer From Scratch
Catalog Information
This project demonstrates a complete Transformer/GPT-style model implemented entirely from scratch in modern C++ (C++17) to highlight systems-level understanding.
Description
The project showcases a Transformer/GPT-style model implemented without deep learning frameworks or external numerical libraries. It includes tensors, autograd, attention, an optimizer, and a tokenizer to demonstrate end-to-end training and inference mechanics. The code highlights systems-level understanding by implementing memory layout, numerics, backpropagation, and training loops in clean, readable C++.
Novelty
7/10
Tags
Claude Models
Quality Score
Strengths
- Consistent naming conventions (snake_case)
- Good security practices; no major issues detected
Weaknesses
- No LICENSE file; legal ambiguity for contributors
- No CI/CD configuration; manual testing and deployment
- 1689 duplicate lines detected; consider DRY refactoring
- 3 'god files' with >500 LOC need decomposition
Recommendations
- Set up CI/CD (GitHub Actions recommended) to automate testing and deployment
- Add a linter configuration to enforce code style consistency
- Add a LICENSE file (MIT recommended for open source)
Security & Health
Languages
Frameworks
Concepts (2)
About: code-quality intelligence by Repobility · https://repobility.com

| Category | Name | Description | Confidence |
|---|---|---|---|
| auto_description | Project Description | A minimal yet complete Transformer/GPT-style model implemented entirely from scratch in modern C++ (C++17). No deep learning frameworks or external numerical libraries—just raw tensors, autograd, attention, optimizer, and a tiny tokenizer to demonstrate end-to-end training and inference mechanics. | 80% |
| auto_category | Data/ML | data-ml | 70% |
Embed Badge
Add to your README:
