Standardized Open-Source Workflow Guidelines for Python & JavaScript Projects
Open-source repositories must start with a clear, consistent structure and naming scheme. For Python, use a top-level README, LICENSE, and a setup.py or pyproject.toml, plus a dedicated source package directory. For example, Kenneth Reitz’s recommended layout includes setup.py, requirements.txt, your package folder (e.g. ./mypkg/ containing __init__.py), plus docs/ and tests/ directories. Package and module names should follow PEP 8: short, all-lowercase for modules (underscores optional if needed), and CapWords for classes. Use virtual environments (venv) and pip to install dependencies during development.
For JavaScript/Node, a typical repo has package.json, lock files (package-lock.json or yarn.lock), and a src/ (or lib/) directory for code, plus test/ for tests. Filenames should be lowercase, with dashes or underscores (no spaces or uppercase), according to many style guides. Use const/let (not var), camelCase for variables/functions, and PascalCase for constructors/classes. Name files to match their default exports (e.g. MyClass.js exporting MyClass). Include a .gitignore and a .npmrc or .yarnrc.yml as needed. In both Python and JS repos, consider a “src” layout (code in src/) if using a build step, but many libraries place the package folder at the root for simplicity.
Folder Structure (Python):
- Repository root: README.md, LICENSE, setup.py or pyproject.toml, requirements.txt (or pyproject.toml with a build-system table), etc.
- Source code: a package directory (./mypkg/) containing an __init__.py. If it’s a single-module library, you can put mypkg.py at the root.
- Tests: a tests/ folder with files like test_*.py.
- Docs/examples: optional docs/ and examples/ directories for documentation and usage demos.
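Put together, a minimal Python layout might look like this (package and file names are illustrative):

```
mypkg-repo/
├── README.md
├── LICENSE
├── pyproject.toml      # or setup.py + requirements.txt
├── mypkg/
│   ├── __init__.py
│   └── core.py
├── tests/
│   └── test_core.py
├── docs/
└── examples/
```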
Folder Structure (JS/Node):
- Repository root: README.md, LICENSE, package.json, lock file (package-lock.json or yarn.lock).
- Source: src/ (or lib/) directory for the main code (especially if transpiled).
- Tests: test/ or __tests__/ with test files (.spec.js or .test.js).
- Docs: optional docs/ and examples/.
- Configuration: include linter configs (.eslintrc.json, .prettierrc), plus Babel/TypeScript config if needed.
Use consistent naming conventions (PEP 8 for Python, Airbnb/Google style for JS). Document these conventions in a style or contributing guide so contributors know the standard.
Python Package Management (pip, Poetry)
Manage Python dependencies with either traditional requirements.txt / setup.py or modern tools like Poetry. Always generate a lock file (requirements.lock or poetry.lock) to pin exact versions. For pip, use pip freeze > requirements.txt (or better, use pip-tools for deterministic requirements). For Poetry (recommended for libraries), list core dependencies under [tool.poetry.dependencies] in pyproject.toml and use dependency groups for extras. Poetry natively supports groups (e.g. a test or docs group) for dev dependencies. Example:
```toml
[tool.poetry]
name = "mypkg"
version = "0.1.0"

[tool.poetry.dependencies]
requests = "^2.0"

[tool.poetry.group.dev.dependencies]
pytest = "^7.0"
pytest-mock = "*"
```
This keeps test-only deps separate. Run poetry install to recreate the environment (including groups via --with dev).
Lock and commit poetry.lock (or requirements.txt / Pipfile.lock). In CI, use pip install -r requirements.txt or poetry install --no-root to install deps. Automate publishing: a typical GitHub Actions pipeline triggers on a tagged release and runs poetry build followed by poetry publish --username __token__ -r pypi (Poetry can use a PyPI API token stored as PYPI_API_TOKEN). The official GitHub docs show using pypa/gh-action-pypi-publish for uploading built distributions. Ensure pyproject.toml has the proper metadata (project name, version, authors, license) per PEP 621.
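For illustration, a minimal sketch of such a release workflow using that action (the action versions and the PYPI_API_TOKEN secret name follow the conventions above; adapt them to your project):

```yaml
# .github/workflows/release.yml — minimal sketch; assumes a PYPI_API_TOKEN secret
name: Release to PyPI
on:
  release:
    types: [published]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install build
      - run: python -m build   # produces dist/*.whl and dist/*.tar.gz
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}
```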
Key Points: Use virtualenv/venv and pip during development. Prefer Poetry or pip-tools for reproducible builds and dependency grouping (dev vs. main). Automate releases: on a tag push or GitHub Release, build wheels/sdist and push to PyPI via CI. Keep dependencies updated (see Security below).
JavaScript Package Management (npm, Yarn)
Use npm (or Yarn) to manage JS packages. Your package.json defines metadata (name, version, scripts, dependencies, devDependencies). Commit your lockfile (package-lock.json or yarn.lock) to ensure reproducible installs. For multi-package repos, consider workspaces: npm (v7+) and Yarn support workspaces for monorepos. Yarn workspaces let you define "workspaces": ["packages/*"] in package.json, hoist shared deps, and link local packages. npm similarly uses a top-level workspaces field (see the npm docs). In a monorepo, keep a single top-level package.json and place each sub-package under e.g. packages/ with its own package.json. Tools like Lerna, pnpm, or Rush can also manage multi-package JS repos.
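For a workspace setup, the root package.json can be as minimal as this sketch (names are placeholders):

```json
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": ["packages/*"]
}
```

Marking the root private prevents accidentally publishing the monorepo shell itself.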
Publishing: use npm publish (npm) or yarn publish (Yarn) to upload to the npm registry (ensure package.json has a unique name and an updated version). You can automate npm publishing via GitHub Actions (e.g. on release tags). For example, a workflow can use actions/setup-node, npm install, npm test, then npm version and npm publish --access public. GitHub also offers an official “Publishing Node.js packages” guide. For private packages, set up the .npmrc registry and tokens.
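A minimal publish workflow sketch along those lines (NPM_TOKEN is an assumed secret name holding an npm automation token):

```yaml
# .github/workflows/npm-publish.yml — minimal sketch; NPM_TOKEN is an assumed secret
name: Publish to npm
on:
  release:
    types: [published]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          registry-url: https://registry.npmjs.org
      - run: npm ci
      - run: npm test
      - run: npm publish --access public
        env:
          NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
```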
Key Points: Define dependencies vs. devDependencies properly. Use workspace tools for multi-package repos. Pin versions via lockfiles. Automate npm publish in CI after checks pass. Include a files field or .npmignore to control published content. Use semantic versioning in package.json.
Testing Frameworks
Automate robust testing in every repo. For Python, use pytest (widely used) or unittest. Write tests in a tests/ directory (named test_*.py). Use fixtures (e.g. pytest-mock) and unittest.mock for isolation. Measure coverage with coverage.py (use pytest --cov=my_pkg or integrate the coverage package). Tools like Codecov can upload coverage reports from CI. The pytest docs recommend installing your package in editable mode (pip install -e .) so tests import the code. Structure tests clearly and separate unit vs. integration tests as needed (pytest marks or folders).
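For illustration, a minimal test file in that layout, assuming a hypothetical add function exported by mypkg:

```python
# tests/test_add.py — minimal sketch; mypkg.add is a hypothetical function
import pytest

from mypkg import add


def test_add_returns_sum():
    assert add(2, 3) == 5


def test_add_rejects_strings():
    # pytest.raises asserts that the expected exception is raised.
    with pytest.raises(TypeError):
        add("2", 3)
```

Run it with pytest --cov=mypkg to collect coverage at the same time.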
For JavaScript, use Jest (popular; includes assertions and mocking) or Vitest. Place tests under __tests__/ or test/. Use mocks (Jest has built-in mocking; Sinon works with other frameworks). Generate coverage via Jest’s built-in support (jest --coverage) or nyc (Istanbul). Store coverage reports as CI artifacts or send them to Codecov. Ensure CI fails if coverage drops below a threshold.
- Python: pytest with pytest-cov, unittest, or nose/tox as needed. Use pytest.ini or pyproject.toml to configure.
- JS: jest (or mocha, though Jest is most common), or Vitest in Vite projects.
- Mocking: in Python, use unittest.mock or pytest fixtures; in JS, use Jest mocks or Sinon.
- CI integration: always run pytest/jest in the CI workflow, e.g. GitHub Actions steps run: pytest --maxfail=1 --disable-warnings -q and run: jest --ci.
Include tests in pull request CI checks and require them to pass before merging. Generate HTML reports or badges for visibility.
Static Analysis & Type Checking
Enforce code quality with linters and type checkers. For Python, common tools are flake8, pylint, or ruff for linting, and mypy or Pyright for type checking. Use Black for code formatting (no configuration, just run black .) and isort for import sorting. For example, a CI step can run flake8 . (or ruff check .) and mypy ./my_pkg to catch errors. Integrate these in CI so style issues fail the build. Many IDEs (VS Code, PyCharm) have plugins for these that lint as you code.
For JavaScript, use ESLint with a shared style config (Airbnb or Google). Run eslint . in CI. Use Prettier for consistent formatting (it can be integrated via --fix or a pre-commit hook). If using TypeScript, enable strict mode in tsconfig.json and run tsc --noEmit to type-check. Otherwise, use JSDoc or type-aware linters. Combine ESLint and Prettier (with eslint-plugin-prettier) to enforce both style and static rules.
Key Points: Include lint/type-check steps in CI, e.g. a runs-on: ubuntu-latest job with steps like pip install flake8 mypy; flake8 .; mypy . or npm install eslint prettier followed by eslint --max-warnings 0 . to fail on any warning. Enforce that all code meets the style guide (PEP 8/ESLint rules) and is type-sound. Consider auto-fixing (e.g. eslint --fix, isort ., black .).
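One common way to run the formatters and linters automatically is a pre-commit hook configuration; a minimal sketch for the Python tools above (the rev pins are illustrative):

```yaml
# .pre-commit-config.yaml — minimal sketch; rev values are illustrative pins
repos:
  - repo: https://github.com/psf/black
    rev: 24.3.0
    hooks:
      - id: black
  - repo: https://github.com/PyCQA/isort
    rev: 5.13.2
    hooks:
      - id: isort
  - repo: https://github.com/PyCQA/flake8
    rev: 7.0.0
    hooks:
      - id: flake8
```

Contributors run pip install pre-commit and pre-commit install once; the checks then run on every commit.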
Versioning, Commit Messages & Changelogs
Adopt Semantic Versioning (MAJOR.MINOR.PATCH). Enforce Conventional Commits: every commit message starts with a type (feat:, fix:, docs:, etc.) and an optional scope. For example: feat(parser): add support for new syntax or fix(ui): prevent button crash. This enables auto-generated changelogs and automated version bumps. Tools like standard-version or semantic-release use the commit history to determine the next version (e.g. bump minor for feat, or major if BREAKING CHANGE appears).
Maintain a CHANGELOG.md (often generated with Conventional Changelog) that lists releases and changes. For example, semantic-release can auto-create GitHub releases with notes from commits. Use Git tags for releases. In repos, you might run standard-version in CI after merges to main to update the version and changelog and create a Git tag automatically.
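That release step might amount to no more than this sketch (assuming the standard-version CLI):

```sh
npx standard-version                  # bumps version, updates CHANGELOG.md, commits, tags
git push --follow-tags origin main    # publishes the release commit and tag
```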
- Commit messages: use the Conventional Commits format. Include a brief description after the type.
- Changelogs: keep a human-readable log. Tools like conventional-changelog or semantic-release can auto-generate it. This helps contributors see the history.
- Version tags: tag releases in Git (e.g. v1.2.3) so CI/CD can trigger publishing.
CI/CD Workflows
Automate building, testing, and deployment with CI. GitHub Actions is recommended for cloud-hosted repos: use workflow YAML files in .github/workflows/. There are templates for Python and Node CI. Each workflow should run on pushes (or PRs) to main branches and perform steps like:
- Check out code: uses: actions/checkout@vX.
- Set up the runtime: actions/setup-python@vX or actions/setup-node@vX with specified versions.
- Install dependencies: pip install -r requirements.txt or poetry install, and npm ci or yarn install.
- Run linters/type checks: e.g. flake8 ., mypy ., eslint . (fail on errors).
- Run tests: pytest --maxfail=1 / jest --ci. Collect coverage.
- Build/package: for Python, python -m build or poetry build. For Node, npm run build or compile TS.
- Publish (on release): trigger on on: release (type: published) to deploy to PyPI or npm. Use secrets for PyPI or npm tokens.
- Notifications: optionally notify Slack/email on failure or success.
You can also use CircleCI or GitLab CI similarly. The key is making CI mandatory for merges (see Branch Protection below). Also, cache dependencies to speed up builds. Use CI badges in the README to show build status and coverage.
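Putting the steps above together, a minimal Python CI workflow sketch (the action versions and the mypkg name are illustrative):

```yaml
# .github/workflows/ci.yml — minimal sketch; versions and names are illustrative
name: CI
on:
  push:
    branches: [main]
  pull_request:
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
          cache: pip                  # caches pip downloads between runs
      - run: pip install -r requirements.txt
      - run: pip install flake8 mypy pytest pytest-cov
      - run: flake8 .
      - run: mypy .
      - run: pytest --maxfail=1 --cov=mypkg
```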
Task Automation (Makefile / Scripts)
Define reusable commands via a Makefile (for Python) or npm scripts (for JS). For example, a Makefile might have:
```make
.PHONY: install lint test

install:
	pip install -r requirements.txt

lint:
	flake8 mypkg
	mypy mypkg

test:
	pytest --cov=mypkg
```
This provides a consistent interface: make test always runs tests, etc. Similarly, in package.json add:
"scripts": {
"lint": "eslint src/**/*.js",
"test": "jest --coverage",
"build": "tsc -p ."
}
so contributors run npm run lint, npm test, etc. Document these scripts in the README. This abstraction helps contributors (especially on cross-language projects) use the same commands.
Dockerization & Local Dev
Provide a Dockerfile for reproducible builds and local development. Follow Docker best practices: use multi-stage builds to keep images small (build in one stage, copy only the needed artifacts into the final image). Choose minimal base images (e.g. the official slim variants python:3.11-slim or node:20-alpine). Include a .dockerignore to exclude unnecessary files. For Python, consider using pip install --no-cache-dir -r requirements.txt. For JS, npm ci --only=production.
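For illustration, a minimal multi-stage sketch for a Node project along these lines (the dist/ output path and entry point are assumptions):

```dockerfile
# Dockerfile — minimal multi-stage sketch; paths and entry point are illustrative
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci                      # full install, including build tooling
COPY . .
RUN npm run build               # e.g. compile TypeScript into dist/

FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production    # runtime dependencies only
COPY --from=build /app/dist ./dist
CMD ["node", "dist/index.js"]
```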
Also provide a docker-compose.yml if the project has multiple services (e.g. a web app plus a database) to simplify local development. This ensures anyone can docker compose up and get a consistent environment. Document the Docker commands (build/run) in the README.
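A minimal compose sketch for the web-app-plus-database case (service names and credentials are placeholders):

```yaml
# docker-compose.yml — minimal sketch; names and credentials are placeholders
services:
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: example   # dev-only placeholder, never a real secret
```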
Contribution Guidelines (README, CONTRIBUTING, CoC)
Encourage public contributions by adding standard community files. At minimum include:
- README.md: must contain a project overview, installation instructions, usage examples, links to docs, the license, and how to get help. Use badges (build, coverage, PyPI/npm version) for quick info. For example:
  - Title and short description.
  - Badge for build status (e.g. CI).
  - Installation (pip install or npm i).
  - Basic usage code snippets.
  - Links to docs or a tutorial.
  - License section.
- CONTRIBUTING.md: outline how to contribute (development setup, coding style, commit conventions, testing). GitHub offers a contributing template. For example, instruct contributors to fork the repo, create a feature branch, write tests, run linters, and make commits with conventional messages. Link this guide prominently (e.g. “Read CONTRIBUTING.md for how to submit patches.”). A clear contributing guide makes PRs smoother.
- CODE_OF_CONDUCT.md: adopt a standard code of conduct (e.g. the Contributor Covenant) to set community expectations. Link to it from CONTRIBUTING.md or the README (e.g. “Please note we have a Code of Conduct in CODE_OF_CONDUCT.md.”).
- Codeowners: a CODEOWNERS file in the repo root (or .github/) can automatically request reviews from experts on particular files. For example:
```
* @global-maintainer
*.py @python-maintainer
*.js @js-maintainer
```
GitHub will assign these users/teams as reviewers on PRs touching matching files. This enforces accountability.
Issue & PR Templates, Labels, and Triage
Standardize issues and pull requests. Use GitHub issue templates and pull request templates (under .github/ISSUE_TEMPLATE/ and .github/PULL_REQUEST_TEMPLATE/) so contributors see prompts for the necessary info. For example, a bug issue template might ask for steps to reproduce and expected vs. actual behavior. A PR template can include a checklist (e.g. “Code builds? Tests added? Docs updated?”) and link to the CONTRIBUTING rules. This helps maintainers triage effectively.
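For illustration, a minimal bug-report template sketch (the front matter follows GitHub’s issue template format; the file name and label are conventions, not requirements):

```markdown
---
name: Bug report
about: Report a defect
labels: bug
---
<!-- Saved as .github/ISSUE_TEMPLATE/bug_report.md -->

**Steps to reproduce**

**Expected behavior**

**Actual behavior**

**Environment:** OS, Python/Node version, package version
```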
Use meaningful labels on issues/PRs (GitHub labels) to categorize work: e.g. bug, enhancement, documentation, help wanted, good first issue. Having a standard label set (which GitHub can auto-populate from .github/labels.yml in some setups) makes it easier for maintainers and newcomers. Actively triage new issues: tag appropriate people, close duplicates, and classify priority.
Branch Protection & Review Workflow
Protect important branches (e.g. main or master) via GitHub’s Branch Protection Rules. Require pull requests (no direct pushes) and enable required status checks (CI tests, linters) before merge. Enforce at least one approving review (or two for larger teams). This ensures all code is reviewed and validated. For large teams, use protected branch patterns (e.g. release/*).
Enable required reviews from code owners: check “Require review from Code Owners” in branch protection. GitHub will then block merging until the owners listed in CODEOWNERS approve. Also consider requiring signed commits or enforcing a DCO for contributions, if needed.
Clearly document the branching strategy (e.g. GitHub Flow, trunk-based development with main, or Gitflow) and communicate it in CONTRIBUTING.md. For example: every feature goes in a topic branch, and PRs are merged only through the GitHub UI after passing all checks.
Security Best Practices
Automate dependency security scanning. Dependabot (built into GitHub) can create PRs to update vulnerable npm or pip dependencies; enable it to scan weekly and auto-merge trivial updates if possible. For npm projects, run npm audit (or use the audit-ci tool in CI) to fail builds on high-severity vulnerabilities. For Python, use Safety (from PyUp) or pip-audit to check for CVEs. Integrate SAST tools (like Bandit for Python or ESLint security rules for JS) in CI for static analysis of code vulnerabilities.
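For illustration, a minimal Dependabot configuration enabling the weekly scans described above:

```yaml
# .github/dependabot.yml — minimal sketch; pick the ecosystems your repo uses
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"
```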
Follow the GitHub “best practices for maintaining dependencies”: use lock files to pin versions, regularly update them, and automate security patches. Subscribe to security advisories for the libraries/frameworks you use. Keep Docker base images up to date with security patches. Do not commit secrets or credentials to the repo; use GitHub Secrets for CI tokens.
Monorepo vs. Polyrepo Patterns
Decide between a single repository (monorepo) or multiple repos. Each has trade-offs:
- Monorepo: one repo contains many projects/packages. Pros: easier code sharing, consistent tooling/CI, and single versioning. All code is visible (improves discoverability and context). With workspaces (Yarn/npm/Lerna), dependency duplication is reduced, and CI can run selective builds (only changed packages). Cons: larger repo size and more complex CI (builds/tests can slow down); requires strict discipline in testing, reviews, and architecture.
- Polyrepo: each project has its own repo. Pros: smaller codebases, independent lifecycles, simpler CI per repo. Teams have autonomy (each service or library is isolated). Cons: more repositories to manage (which can multiply effort), harder cross-cutting changes (one change spans many repos), and possible code duplication; requires strong team governance to keep standards aligned across repos.
Choose based on team size and project coupling. This blueprint’s best practices apply in either model, but in monorepos leverage workspace tooling, and in polyrepos use templates and automation to enforce consistency.