Roadmap

Our initial roadmap spans three years and is split into three overarching phases, each planned for timely product development and deployment. This approach lets stakeholders access new elements of our ecosystem in parallel with our ongoing R&D efforts.

Phase 1 (Foundation): Setup and Initial Release

Objective: Establish the foundational technology and demonstrate capabilities.

  1. Q1 2024: Preliminary Setup

  • Set up head nodes in strategic geographic locations to optimize network performance and ensure data compliance.

  2. Q2 2024: Development and Testing

  • Release Cerebrum Cloud Beta to selected beta testers.

  3. Q3 2024: Onboarding and PoC

  • Establish an onboarding process for new users with CUDA-compatible devices.

  • Showcase a Proof of Concept for parallel task distribution to demonstrate the system's efficiency and effectiveness.
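As a rough illustration of what parallel task distribution means in practice, the sketch below splits a workload into chunks and fans them out to a worker pool, then combines the partial results. This is a minimal, hypothetical example: the function names and pool size are assumptions for illustration, not Cerebrum's actual API, and threads stand in for remote nodes.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for a compute-heavy step run on one worker/node.
    return sum(x * x for x in chunk)

def split_task(data, n_workers):
    # Partition the workload into roughly equal chunks, one per worker.
    size = max(1, len(data) // n_workers)
    return [data[i:i + size] for i in range(0, len(data), size)]

def distribute(data, n_workers=4):
    # Fan the chunks out to the pool (threads stand in for remote
    # nodes here) and aggregate the partial results.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(process_chunk, split_task(data, n_workers))
    return sum(partials)
```

The efficiency claim of the PoC rests on the same idea: independent chunks can be processed concurrently, so total wall-clock time approaches the cost of the largest chunk rather than the whole workload.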

  4. Q4 2024: Network Expansion and Validator Initiation

  • Introduce our first network validators for mobile and desktop devices to enhance network security and integrity.

  • Begin network expansion phase, scaling up based on feedback and initial performance metrics.

Phase 2 (Expansion): Scaling and Diversification

Objective: Expand the reach and versatility of the network.

  1. Q1 2025: Full-Scale Deployment

  • Officially launch Cerebrum Cloud and Cerebrum Node, transitioning from beta to full deployment.

  • Introduce Cerebrum Agent, including Cerebrum Digital Twin, for beta testing.

  2. Q2 2025: Inclusivity and Accessibility

  • Expand Cerebrum Node to support non-CUDA architectures, making the network accessible to more users.

  • Integrate a HuggingFace portal into the Cerebrum Dashboard for public access.

  3. Q3 2025: Infrastructure Enhancement

  • Develop and deploy a custom load balancer for efficient task distribution across the network.
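One plausible core of such a load balancer is a least-loaded policy: each incoming task goes to the node with the lowest utilization ratio. The sketch below is a hedged illustration under that assumption; the `Node` fields and the scoring rule are hypothetical, not a description of Cerebrum's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity: int   # concurrent tasks the node can run
    active: int = 0 # tasks currently assigned

    def load(self):
        # Utilization ratio used to rank nodes.
        return self.active / self.capacity

class LoadBalancer:
    def __init__(self, nodes):
        self.nodes = list(nodes)

    def assign(self):
        # Route the task to the least-utilized node and record it.
        node = min(self.nodes, key=lambda n: n.load())
        node.active += 1
        return node
```

Ranking by ratio rather than raw task count lets heterogeneous nodes (e.g. a large GPU server vs. a laptop) receive work proportional to their capacity.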

Phase 3 (Distribution): Advanced Features and Expansion

Objective: Innovate and extend the network’s capabilities.

  1. Q1 2026: Version Upgrade

  • Release v2.0 for Cerebrum Node with enhanced features and performance optimizations, including the new load balancer.

  2. Q2 2026: Mobile Integration

  • Begin including mobile devices as nodes for handling CPU-intensive data tasks.

  3. Q3 2026: Enhanced Task Distribution

  • Upgrade the task distribution system to support both CPU and GPU nodes, allowing more flexible and efficient resource usage.
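Supporting both node classes essentially means routing each task to the right pool based on its declared requirements. The sketch below shows one minimal way to do that; the task schema, pool names, and scheduler interface are illustrative assumptions rather than the actual upgraded system.

```python
from collections import deque

class Scheduler:
    def __init__(self):
        # Separate FIFO queues per node class.
        self.queues = {"cpu": deque(), "gpu": deque()}

    def submit(self, task_id, needs_gpu):
        # Route the task to the pool matching its declared requirement.
        pool = "gpu" if needs_gpu else "cpu"
        self.queues[pool].append(task_id)
        return pool

    def next_for(self, pool):
        # A node of the given class pulls its next task, if any.
        q = self.queues[pool]
        return q.popleft() if q else None
```

Keeping CPU and GPU work in separate queues prevents CPU-only nodes (including the mobile devices onboarded in Q2) from starving while GPU-bound jobs wait, and vice versa.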
