# 🤝 Contributing to NeuroLink

Thank you for your interest in contributing to NeuroLink! We welcome contributions from the community and are excited to work with you.

## 📋 Table of Contents

- [Code of Conduct](#code-of-conduct)
- [How to Contribute](#how-to-contribute)
- [Development Setup](#development-setup)
- [Project Structure](#project-structure)
- [Coding Standards](#coding-standards)
- [Testing Guidelines](#testing-guidelines)
- [Pull Request Process](#pull-request-process)
- [Documentation](#documentation)
- [Community](#community)
## Code of Conduct

Please read and follow our Code of Conduct. We are committed to providing a welcoming and inclusive environment for all contributors.
## How to Contribute

### Reporting Issues

- **Check existing issues** - Before creating a new issue, check if it already exists
- **Use issue templates** - Use the appropriate template for bugs, features, or questions
- **Provide details** - Include reproduction steps, environment details, and expected behavior
### Suggesting Features

- **Open a discussion** - Start with a GitHub Discussion to gather feedback
- **Explain the use case** - Help us understand why this feature would be valuable
- **Consider alternatives** - What workarounds exist today?
### Contributing Code

1. **Fork the repository** - Create your own fork of the project
2. **Create a feature branch** - `git checkout -b feature/your-feature-name`
3. **Make your changes** - Follow our coding standards
4. **Write tests** - Ensure your changes are tested
5. **Submit a pull request** - Follow our PR template
## Development Setup

### Prerequisites

- Node.js 18+ and pnpm 9+
- Git
- At least one AI provider API key (OpenAI, Google AI, etc.)
### Local Development

```bash
# Clone your fork
git clone https://github.com/YOUR_USERNAME/neurolink.git
cd neurolink

# Install dependencies
pnpm install

# Set up environment variables
cp .env.example .env
# Edit .env with your API keys

# Build the project
pnpm run build

# Run tests
pnpm test

# Run linting
pnpm run lint

# Run type checking
pnpm run check
```
### Running Examples

```bash
# Test CLI
npx tsx src/cli/index.ts generate "Hello world"

# Run example scripts
pnpm run example:basic
pnpm run example:streaming

# Start demo server
cd neurolink-demo && pnpm start
```
## Project Structure

```
neurolink/
├── src/
│   ├── lib/
│   │   ├── core/        # Core types and base classes
│   │   ├── providers/   # AI provider implementations
│   │   ├── factories/   # Factory pattern implementation
│   │   ├── mcp/         # Model Context Protocol integration
│   │   └── sdk/         # SDK extensions and tools
│   └── cli/             # Command-line interface
├── docs/                # Documentation
├── test/                # Test files
├── examples/            # Example usage
└── scripts/             # Build and utility scripts
```
### Key Components

- **`BaseProvider`** - Abstract base class all providers inherit from
- **`ProviderRegistry`** - Central registry for provider management
- **`CompatibilityFactory`** - Handles provider creation and compatibility
- **MCP Integration** - Built-in and external tool support
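As a rough sketch of how these components fit together (the class and method names below are illustrative assumptions, not the actual NeuroLink API), a new provider extends the abstract base class and is looked up through a registry:

```typescript
// Illustrative sketch only -- these names are assumptions, not NeuroLink's real API.
interface GenerateOptions {
  input: { text: string };
  temperature?: number;
  maxTokens?: number;
}

interface GenerateResult {
  content: string;
  provider: string;
}

// Stand-in for the abstract base class described above.
abstract class BaseProvider {
  abstract readonly name: string;
  abstract generate(options: GenerateOptions): Promise<GenerateResult>;
}

// A hypothetical provider implementation for demonstration.
class EchoProvider extends BaseProvider {
  readonly name = "echo";

  async generate(options: GenerateOptions): Promise<GenerateResult> {
    // A real provider would call its backing API here; this one echoes the input.
    return { content: options.input.text, provider: this.name };
  }
}

// A toy registry keyed by provider name, mirroring the ProviderRegistry idea.
const registry = new Map<string, BaseProvider>();
registry.set("echo", new EchoProvider());

registry
  .get("echo")!
  .generate({ input: { text: "Hello" } })
  .then((result) => console.log(result.content)); // prints "Hello"
```

The factory/registry indirection is what lets callers switch providers by name without depending on any concrete implementation.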
## Coding Standards

### TypeScript Style Guide

```typescript
// ✅ Good: Clear types with documentation
type GenerateOptions = {
  /** The input text to process */
  input: { text: string };
  /** Temperature for randomness (0-1) */
  temperature?: number;
  /** Maximum tokens to generate */
  maxTokens?: number;
};

// ✅ Good: Proper error handling
async function generate(options: GenerateOptions): Promise<GenerateResult> {
  try {
    // Implementation
  } catch (error) {
    throw new NeuroLinkError("Generation failed", { cause: error });
  }
}

// ❌ Bad: Avoid `any` types
function process(data: any) {
  // Use specific types instead
  // Implementation
}
```
### Best Practices

- **Use the factory pattern** - All providers should extend `BaseProvider`
- **Type everything** - No implicit `any` types
- **Handle errors gracefully** - Use try-catch and provide meaningful errors
- **Document public APIs** - Use JSDoc comments for all public methods
- **Keep functions small** - Single responsibility principle
- **Write tests first** - TDD approach encouraged
### Naming Conventions

- **Files**: `kebab-case.ts` (e.g., `base-provider.ts`)
- **Classes**: `PascalCase` (e.g., `OpenAIProvider`)
- **Interfaces**: `PascalCase` (e.g., `GenerateOptions`)
- **Functions**: `camelCase` (e.g., `createProvider`)
- **Constants**: `UPPER_SNAKE_CASE` (e.g., `DEFAULT_TIMEOUT`)
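Put together, a hypothetical file named `create-provider.ts` (the file name and all identifiers below are made up to illustrate the conventions, not taken from the codebase) would look like:

```typescript
// Hypothetical contents of create-provider.ts (kebab-case file name).

const DEFAULT_TIMEOUT = 30_000; // UPPER_SNAKE_CASE constant

// PascalCase interface
interface ProviderConfig {
  apiKey: string;
  timeoutMs?: number;
}

// PascalCase class
class OpenAIProvider {
  constructor(private config: ProviderConfig) {}

  describe(): string {
    return `timeout=${this.config.timeoutMs ?? DEFAULT_TIMEOUT}`;
  }
}

// camelCase function
function createProvider(config: ProviderConfig): OpenAIProvider {
  return new OpenAIProvider(config);
}

console.log(createProvider({ apiKey: "sk-test" }).describe()); // prints "timeout=30000"
```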
## Testing Guidelines

### Test Structure

```typescript
import { describe, it, expect } from "vitest";
import { OpenAIProvider } from "../src/providers/openai";

describe("OpenAIProvider", () => {
  describe("generate", () => {
    it("should generate text with valid options", async () => {
      const provider = new OpenAIProvider();
      const result = await provider.generate({
        input: { text: "Hello" },
        maxTokens: 10,
      });

      expect(result.content).toBeDefined();
      expect(result.content.length).toBeGreaterThan(0);
    });

    it("should handle errors gracefully", async () => {
      // Test error scenarios
    });
  });
});
```
### Testing Requirements

- **Unit tests** - For all public methods
- **Integration tests** - For provider interactions
- **Mock external calls** - Don't hit real APIs in tests
- **Test edge cases** - Empty inputs, timeouts, errors
- **Maintain coverage** - Aim for >80% code coverage
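One simple way to satisfy the "mock external calls" requirement is plain dependency injection, so tests substitute a fake for the network call. This is a self-contained sketch under assumed names (`FetchLike`, `ApiProvider`, and the URL are inventions for illustration, not NeuroLink types):

```typescript
// A provider that accepts an injectable fetch-like function, so tests can
// substitute a fake instead of making real network calls.
type FetchLike = (url: string) => Promise<{ text: string }>;

class ApiProvider {
  constructor(private fetchFn: FetchLike) {}

  async generate(prompt: string): Promise<string> {
    // Hypothetical endpoint; a real provider would build its actual request here.
    const response = await this.fetchFn(`https://api.example.com/v1?q=${prompt}`);
    return response.text;
  }
}

// In a test, inject a fake that records calls and returns canned data.
const calls: string[] = [];
const fakeFetch: FetchLike = async (url) => {
  calls.push(url);
  return { text: "mocked completion" };
};

new ApiProvider(fakeFetch).generate("Hello").then((result) => {
  console.log(result); // prints "mocked completion"
  console.log(calls.length); // prints 1
});
```

The same effect can be achieved with vitest's built-in mocking utilities; the injection pattern just keeps the example runnable without a test framework.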
### Running Tests

```bash
# Run all tests
pnpm test

# Run tests in watch mode
pnpm run test:watch

# Run with coverage
pnpm run test:coverage

# Run the provider test suite
pnpm test:providers
```
## Pull Request Process

### Before Submitting

1. **Update documentation** - Keep docs in sync with code changes
2. **Add tests** - New features need tests
3. **Run checks** - `pnpm run lint && pnpm run check && pnpm test`
4. **Update CHANGELOG** - Add your changes under "Unreleased"
### PR Template

```markdown
## Description

Brief description of changes

## Type of Change

- [ ] Bug fix
- [ ] New feature
- [ ] Breaking change
- [ ] Documentation update

## Testing

- [ ] Tests pass locally
- [ ] Added new tests
- [ ] Updated documentation

## Related Issues

Fixes #123
```
### Review Process

1. **Automated checks** - CI/CD must pass
2. **Code review** - At least one maintainer approval
3. **Documentation review** - Docs team review if needed
4. **Testing** - Manual testing for significant changes
## Documentation

### Documentation Standards

- **Keep it current** - Update docs with code changes
- **Show examples** - Every feature needs examples
- **Explain why** - Not just what, but why
- **Test code snippets** - Ensure examples actually work
- **Update the matrix** - Mark coverage in `docs/tracking/FEATURE-DOC-MATRIX.md` when new user-facing work lands
### Documentation Structure

- **API Reference** - Generated from TypeScript types
- **Guides** - Step-by-step tutorials
- **Examples** - Working code samples
- **Architecture** - System design documentation
### Writing Documentation

````markdown
# Feature Name

## Overview

Brief description of what this feature does and why it's useful.

## Usage

```typescript
// Clear, working example
const result = await provider.generate({
  input: { text: "Example prompt" },
  temperature: 0.7,
});
```

## API Reference

Detailed parameter descriptions and return types.

## Best Practices

Tips for effective usage.

## Common Issues

Known gotchas and solutions.
````
## Community

### Getting Help

- **GitHub Discussions** - Ask questions and share ideas
- **Issues** - Report bugs and request features
- **Discord** - Community chat is planned for the future
### Ways to Contribute

- **Code** - Fix bugs, add features
- **Documentation** - Improve guides and examples
- **Testing** - Add test coverage
- **Design** - UI/UX improvements
- **Community** - Help others, answer questions
### Recognition

We value all contributions! Contributors are:

- Listed in our Contributors page
- Mentioned in release notes
- Given credit in the changelog
## 🎯 Current Focus Areas

We're particularly interested in contributions for:

- **Provider Support** - Adding new AI providers
- **Tool Integration** - MCP external server activation
- **Performance** - Optimization and benchmarking
- **Documentation** - Tutorials and guides
- **Testing** - Increasing test coverage
## 📝 License

By contributing to NeuroLink, you agree that your contributions will be licensed under the MIT License.

Thank you for contributing to NeuroLink! 🚀