Found 656 repositories (showing 30)
Xerophayze
TTS-Story is a web-based multi-voice TTS studio for turning tagged scripts into audiobooks, featuring full speaker management, chunk review/regeneration, a job queue and library system, and local GPU or API backends including the Kokoro, Chatterbox, VOX CPM, Pocket-TTS, Kitten-TTS, IndexTTS-2, QWEN3 TTS, and Omnivoice engines.
Hazrat-Ali9
Job Management Backend: a powerful RESTful API for smart job tracking and career management. It contains the backend system of a modern job platform designed to streamline job posting, application tracking, candidate management, and more, all built for scalability and real-world use.
wgzhao
Addax Admin is a web-based management console for Addax ETL jobs, offering task scheduling, monitoring, and data management with a Vue 3 frontend and Spring Boot backend.
CaptainEFFF
# All the News That's Fit to Scrape

### Overview

In this assignment, you'll create a web app that lets users view and leave comments on the latest news. But you're not going to actually write any articles; instead, you'll flex your Mongoose and Cheerio muscles to scrape news from another site.

### Before You Begin

1. Create a GitHub repo for this assignment and clone it to your computer. Any name will do -- just make sure it's related to this project in some fashion.
2. Run `npm init`. When that's finished, install and save these npm packages:
   1. express
   2. express-handlebars
   3. mongoose
   4. cheerio
   5. axios
3. **NOTE**: If you want to earn complete credit for your work, you must use all five of these packages in your assignment.
4. In order to deploy your project to Heroku, you must set up an mLab provision. mLab is a remote MongoDB database that Heroku supports natively. Follow these steps to get it running:
5. Create a Heroku app in your project directory.
6. Run this command in your Terminal/Bash window:
   * `heroku addons:create mongolab`
   * This command will add the free mLab provision to your project.
7. When you go to connect your mongo database to mongoose, do so the following way:

   ```js
   // If deployed, use the deployed database. Otherwise use the local mongoHeadlines database
   var MONGODB_URI = process.env.MONGODB_URI || "mongodb://localhost/mongoHeadlines";

   mongoose.connect(MONGODB_URI);
   ```

   * This code should connect mongoose to your remote mongolab database if deployed, but otherwise will connect to the local mongoHeadlines database on your computer.
8. [Watch this demo of a possible submission](https://youtu.be/4ltZr3VPmno). See the deployed demo application [here](http://nyt-mongo-scraper.herokuapp.com/).
9. Your site doesn't need to match the demo's style, but feel free to attempt something similar if you'd like. Otherwise, just be creative!

### Commits

Having an active and healthy commit history on GitHub is important for your future job search. It is also extremely important for making sure your work is saved in your repository. If something breaks, committing often ensures you are able to go back to a working version of your code.

* Committing often is a signal to employers that you are actively working on your code and learning.
* We use the mantra "commit early and often." This means that when you write code that works, add it and commit it!
* Numerous commits allow you to see how your app is progressing and give you a point to revert to if anything goes wrong.
* Be clear and descriptive in your commit messaging.
* When writing a commit message, avoid vague messages like "fixed." Be descriptive so that you and anyone else looking at your repository knows what happened with each commit.
* We would like you to have well over 200 commits by graduation, so commit early and often!

### Submission on BCS

* **This assignment must be deployed.**
* Please submit both the deployed Heroku link to your homework AND the link to the GitHub repository!

## Instructions

* Create an app that accomplishes the following:
  1. Whenever a user visits your site, the app should scrape stories from a news outlet of your choice and display them for the user. Each scraped article should be saved to your application database. At a minimum, the app should scrape and display the following information for each article:
     * Headline - the title of the article
     * Summary - a short summary of the article
     * URL - the url to the original article
     * Feel free to add more content to your database (photos, bylines, and so on).
  2. Users should also be able to leave comments on the articles displayed and revisit them later. The comments should be saved to the database as well and associated with their articles. Users should also be able to delete comments left on articles. All stored comments should be visible to every user.
* Beyond these requirements, be creative and have fun with this!

### Tips

* Go back to Saturday's activities if you need a refresher on how to partner one model with another.
* Whenever you scrape a site for stories, make sure an article isn't already represented in your database before saving it; do not save any duplicate entries.
* Don't just clear out your database and populate it with scraped articles whenever a user accesses your site.
* If your app deletes stories every time someone visits, your users won't be able to see any comments except the ones that they post.

### Helpful Links

* [MongoDB Documentation](https://docs.mongodb.com/manual/)
* [Mongoose Documentation](http://mongoosejs.com/docs/api.html)
* [Cheerio Documentation](https://github.com/cheeriojs/cheerio)

### Reminder: Submission on BCS

* Please submit both the deployed Heroku link to your homework AND the link to the GitHub repository!

---

### Minimum Requirements

* **This assignment must be deployed.** Attempt to complete the homework assignment as described in the instructions. If unable to complete certain portions, please pseudocode these portions to describe what remains to be completed. Hosting on Heroku and adding a README.md are required for this homework. In addition, add this homework to your portfolio; more information can be found below.

---

### Hosting on Heroku

Now that we have a backend to our applications, we use Heroku for hosting. Please note that while **Heroku is free**, it will request credit card information if you have more than 5 applications at a time or are adding a database. Please see [Heroku's Account Verification Information](https://devcenter.heroku.com/articles/account-verification) for more details.

---

### Create a README.md

Add a `README.md` to your repository describing the project. Here are some resources to help you along the way:

* [About READMEs](https://help.github.com/articles/about-readmes/)
* [Mastering Markdown](https://guides.github.com/features/mastering-markdown/)

---

### Add To Your Portfolio

After completing the homework, please add the piece to your portfolio. Make sure to add a link to your updated portfolio in the comments section of your homework so the TAs can easily ensure you completed this step when they are grading the assignment. To receive an 'A' on any assignment, you must link to it from your portfolio.

---

### One Last Thing

If you have any questions about this project or the material we have covered, please post them in the community channels in Slack so that your fellow developers can help you! If you're still having trouble, you can come to office hours for assistance from your instructor and TAs. That goes threefold for this unit: MongoDB and Mongoose compose a challenging data management system. If there's anything you find confusing about these technologies, don't hesitate to speak with someone from the Boot Camp team.

**Good Luck!**
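The "don't save duplicate entries" tip can be sketched with a plain in-memory store keyed by URL. The `saveArticle` helper below is hypothetical, not part of the assignment starter code; a real implementation would query Mongoose (e.g. `Article.findOne({ url })`) before inserting:

```javascript
// Minimal sketch of "don't save duplicate articles", keyed by URL.
// In the real app the store would be a MongoDB collection via Mongoose;
// a Map stands in here so the logic is easy to test in isolation.
const store = new Map();

function saveArticle(article) {
  // An article must at least carry a headline, summary, and URL.
  if (!article.headline || !article.summary || !article.url) {
    throw new Error("article is missing a required field");
  }
  // Skip anything already represented in the database.
  if (store.has(article.url)) {
    return false; // duplicate, not saved
  }
  store.set(article.url, article);
  return true; // newly saved
}
```

With Mongoose, the same effect can be achieved more robustly by declaring `unique: true` on the `url` field, or with an upsert such as `Article.updateOne({ url }, { $setOnInsert: article }, { upsert: true })`, so concurrent scrapes don't race each other.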
Rwad Al-Furas Backend: Job matching, freelance project management, and donation facilitation API. Built with Django, DRF, JWT, Redis, and PostgreSQL with Trigram similarity for advanced search.
gilbertozioma
I developed and integrated the backend for JobRadar, which lets users register, log in, browse jobs, and create and manage their own listings. It provides a user-friendly platform for streamlined job search and management.
NightBringerZied
Jobick Backend is a robust and scalable RESTful API built with Go (Gin Framework) and PostgreSQL, designed to power a modern job portal connecting candidates and recruiters. It provides authentication, job management, applications, file handling, real-time notifications via WebSockets, and complete API documentation using Swagger.
jdrenteria
aakashdixit22
The backend API for the Talx Job Portal, built with Node.js and Express.js. It provides robust features like job management 💼, secure authentication 🔐, email subscriptions 📧, news integration 📰, contact form handling 📑, and job application management 📄. Powered by MongoDB, it ensures a seamless experience for job seekers and employers alike.
sajjathossainbd
The "Hiring Staff" backend repository handles secure authentication, role-based access control, job management, candidate-recruiter interactions, and advanced filtering, providing API endpoints to streamline the recruitment process efficiently.
Troth99
Job Board app: React frontend + Node.js backend with user authentication, job and company management, and theme switching.
karantaragi07
A backend system for skill-based user-job matching using Spring Boot and JPA. Implements authentication and user profile management.
Aspect022
A full-fledged Job Portal backend built with Spring Boot, featuring JWT and OAuth2 authentication, a DTO-based architecture, and RESTful APIs for job seekers, employers, and job applications. Includes secured endpoints and Spring Data JPA for efficient database management. Ready for frontend integration! 🚀
Rahul-7323
An application that implements flashcards for memory training. Core functionality includes a REST API, user login, a dashboard, review sessions, deck management, validations, server-sent events, async deck exports, a Discord webhook for daily notifications, Redis and Celery for background jobs, and a visually appealing, responsive UI built with Tailwind CSS.
abdullahmia
A scalable Pharmacy Management System backend built with Express, TypeScript, Prisma, and Redis. Includes JWT auth, RBAC, inventory modules, background jobs, Swagger docs, and Docker support — following modular monolithic architecture.
CODEMASTER-ABDULLAH-92
A modern job application platform built with the MERN stack (MongoDB, Express, React, Node.js). It integrates Tailwind CSS and custom styling for a sleek, responsive user interface, offers efficient job posting and application management, and exposes a backend API with robust CRUD functionality.
PerezChris99
This robust and scalable Node.js API serves as a foundational backend solution for managing and distributing job opportunities. Designed with developers and recruitment platforms in mind, it provides a comprehensive suite of endpoints to handle the full lifecycle of job postings, from creation and management to dynamic searching and filtering.
Leta-Kasahun
JobSphere is a full-featured job portal backend built with Spring Boot and PostgreSQL, supporting job seekers, employers, and admin management. It provides secure user management, job postings, applications, CV building, and job alerts in a scalable and maintainable architecture.
CaiquePirs
Job Vacancy Management API — Cloud Backend with Spring Boot & AWS
BuildMindX
A powerful Flask-based RESTful API backend for the JobHunt Pro job board application. Features comprehensive job management capabilities, automated job scraping, and advanced search functionality.
pallavi6311
A web-based Job Portal System built with Java Swing (frontend) and MySQL (backend). The system connects job seekers and employers by enabling job posting, job applications, and user profile management with secure authentication.
Aries-Surya
Full-stack job portal using a React (Chakra UI) frontend, a Django backend, and MongoDB. Features: user dashboards, job listings, and application management. Open-source and for educational purposes. Contributions welcome!
anuragsahu-dev
Financial management backend API built with Node.js, TypeScript, and PostgreSQL. Includes authentication, payments, background jobs, Docker deployment, and CI/CD.
HUST-AI-Recruitment
A backend system for a recruitment platform built with Go, Gin, MySQL, and JWT. Features include job and resume management, user authentication, and AI-powered resume screening and job matching.
echandsome
Node.js backend for web scraping and automation with Playwright, scheduled jobs, session management, and REST API. Uses Express, MySQL/Sequelize, and cron.
eddiemuhoro
A full-stack recruitment platform featuring AI-driven job recommendations, comprehensive admin management, and modern responsive design. Built with React/TypeScript frontends, FastAPI backend, and PostgreSQL database.
Anantram777
An online job portal website that connects employers and job seekers on a single platform. Employers can post job openings, and candidates can search, apply, and track applications. Features include user authentication, role-based dashboards, and responsive design. Built with HTML, CSS, and JavaScript, with an optional backend for data management.
FarrukhKamalov
Jobbee API is a comprehensive backend system for managing job listings, applications, and user authentication. It provides a RESTful API that supports CRUD operations, user management, and secure authentication.
ayushmanayush
A backend-driven job application management system (work in progress) that tracks the application lifecycle, schedules interview reminders, and provides analytics on success and rejection trends to help candidates improve their strategy.
98001yash
This is the backend project for a real-world LinkedIn clone application built using a microservices architecture. It provides core functionalities like user registration, profile management, messaging, and job listings.