
Create Analysis

Generate a new analysis for an existing project. Use this endpoint to refresh an analysis with updated data, re-analyze after uploading new files, or create additional analysis runs for the same company.

Endpoint

POST https://api.kruncher.ai/api/integration/analysis/company

Headers

Header | Required | Description
Authorization | Yes | Your API key (format: YOUR_API_KEY)
Content-Type | Yes | Must be application/json

Request Body

Field | Type | Required | Description
projectId | string | Yes | Unique identifier for the project to analyze

Use Cases

When to Create a New Analysis

  • Updated Information: Company has raised new funding or changed business model
  • New Documents: After uploading additional pitch decks or financial statements
  • Refresh Insights: Re-analyze with updated AI models or criteria
  • Comparative Analysis: Create multiple analysis snapshots over time
  • Data Changes: After updating company details or entities

Code Examples

JavaScript/TypeScript

CODE
const API_KEY = "YOUR_API_KEY_HERE";
 
const response = await fetch("https://api.kruncher.ai/api/integration/analysis/company", {
  method: "POST",
  headers: {
    "Authorization": `${API_KEY}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    projectId: "521a93a6-091d-4943-ba13-7c1a654a14ae"
  })
});
 
const result = await response.json();
console.log("New analysis created:", result.analysisId);
console.log("Project:", result.project);

Result: Creates a new analysis and returns the analysis ID.

Python

CODE
import requests
 
API_KEY = "YOUR_API_KEY_HERE"
url = "https://api.kruncher.ai/api/integration/analysis/company"
 
headers = {
    "Authorization": f"{API_KEY}",
    "Content-Type": "application/json"
}
 
data = {
    "projectId": "521a93a6-091d-4943-ba13-7c1a654a14ae"
}
 
response = requests.post(url, headers=headers, json=data)
 
if response.status_code == 200:
    result = response.json()
    print(f"Analysis created: {result['analysisId']}")
    print(f"Project: {result['project']['name']}")
else:
    print(f"Error: {response.status_code} {response.text}")

Result: Creates a new analysis for the project.

cURL

CODE
curl -X POST "https://api.kruncher.ai/api/integration/analysis/company" \
  -H "Authorization: YOUR_API_KEY_HERE" \
  -H "Content-Type: application/json" \
  -d '{
    "projectId": "521a93a6-091d-4943-ba13-7c1a654a14ae"
  }'

Response

Success Response (200 OK)

CODE
{
  "code": "1000",
  "title": "Success",
  "description": "Analysis created successfully",
  "analysisId": "analysis_xyz789",
  "project": {
    "id": "521a93a6-091d-4943-ba13-7c1a654a14ae",
    "name": "TechStartup",
    "companyName": "TechStartup Inc.",
    "companyWebsite": "https://techstartup.com"
  }
}

Error Responses

400 Bad Request

CODE
{
  "code": "4000",
  "title": "Bad Request",
  "description": "Invalid project ID"
}

404 Not Found

CODE
{
  "code": "4040",
  "title": "Not Found",
  "description": "Project not found"
}

Common Workflows

Workflow 1: Refresh Analysis After Data Update

CODE
// After updating company data
await updateAnalysisEntity(analysisId, "revenue", 10000000);
await updateAnalysisEntity(analysisId, "employees", 75);
 
// Create new analysis to refresh insights
const newAnalysis = await manager.createAnalysis(projectId);
await manager.waitForAnalysis(newAnalysis.analysisId);
 
console.log("Analysis refreshed with new data");

Workflow 2: Analyze New Documents

CODE
# Upload new pitch deck and create fresh analysis
manager = KruncherAnalysisManager()
 
result = manager.create_analysis_with_files(
    project_id="521a93a6-091d-4943-ba13-7c1a654a14ae",
    file_paths=["Q4-2024-deck.pdf", "updated-metrics.xlsx"],
    wait=True
)
 
print(f"New analysis completed: {result['analysisId']}")

Workflow 3: Periodic Re-Analysis

CODE
import schedule
import time
 
def refresh_all_projects():
    """Refresh analysis for all active projects."""
    manager = KruncherAnalysisManager()
    
    # Get all projects
    response = manager.session.get(f"{manager.base_url}/integration/projects")
    projects = response.json()['data']
    
    for project in projects:
        if project.get('processing') == 'completed':
            print(f"Refreshing {project['name']}...")
            try:
                manager.create_analysis(project['id'])
            except Exception as e:
                print(f"Failed to refresh {project['name']}: {e}")
 
# Schedule weekly refresh
schedule.every().monday.at("09:00").do(refresh_all_projects)
 
while True:
    schedule.run_pending()
    time.sleep(3600)

Analysis Status

After you create an analysis, it moves through several status stages:

Status | Description
pending | Analysis queued, not started yet
progress | Analysis currently running
completed | Analysis finished successfully
failed | Analysis encountered an error

You can check the status by calling the Find Companies endpoint and inspecting the analyses array.
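
A minimal status-check sketch in Python, assuming the Find Companies endpoint is the GET /api/integration/projects route used in Workflow 3 and that each project carries an analyses array whose entries include status and createdAt fields; confirm both against your actual responses.

CODE
import requests

API_KEY = "YOUR_API_KEY_HERE"
BASE_URL = "https://api.kruncher.ai/api"
PROJECT_ID = "521a93a6-091d-4943-ba13-7c1a654a14ae"

# Fetch projects via the Find Companies endpoint (path assumed from Workflow 3).
response = requests.get(f"{BASE_URL}/integration/projects",
                        headers={"Authorization": API_KEY})
response.raise_for_status()
projects = response.json()["data"]

# Pick our project and its most recent analysis by createdAt.
project = next(p for p in projects if p["id"] == PROJECT_ID)
latest = max(project["analyses"], key=lambda a: a["createdAt"])
print(f"Latest analysis status: {latest['status']} (created {latest['createdAt']})")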

Best Practices

Check Before Creating

  • Verify project exists before creating analysis
  • Check whether the current analysis is still in progress (see the sketch below)
  • Consider rate limits for bulk analysis creation
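
A sketch of gating creation on these checks, under the same assumptions as the status example above (Find Companies response shape with an analyses array); treat the field names as placeholders to verify.

CODE
import requests

API_KEY = "YOUR_API_KEY_HERE"
BASE_URL = "https://api.kruncher.ai/api"
PROJECT_ID = "521a93a6-091d-4943-ba13-7c1a654a14ae"
headers = {"Authorization": API_KEY, "Content-Type": "application/json"}

# Look the project up first; a missing project means nothing should be created.
projects = requests.get(f"{BASE_URL}/integration/projects", headers=headers).json()["data"]
project = next((p for p in projects if p["id"] == PROJECT_ID), None)

if project is None:
    print("Project not found - skipping analysis creation")
elif any(a["status"] in ("pending", "progress") for a in project.get("analyses", [])):
    print("An analysis is already running - skipping")
else:
    resp = requests.post(f"{BASE_URL}/integration/analysis/company",
                         headers=headers, json={"projectId": PROJECT_ID})
    print("Created analysis:", resp.json().get("analysisId"))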

File Uploads

  • Upload files immediately after creating analysis
  • Use the returned analysisId for file uploads
  • Don’t wait for the first analysis to complete before uploading

Status Polling

  • Poll every 5-10 seconds for status updates (see the sketch below)
  • Implement timeout for long-running analyses
  • Handle failed analyses gracefully
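
A polling sketch that follows these guidelines with a 10-second interval and a 10-minute timeout. It assumes the same Find Companies response shape as in the status example above; adjust the path, fields, and limits to your environment.

CODE
import time
import requests

API_KEY = "YOUR_API_KEY_HERE"
BASE_URL = "https://api.kruncher.ai/api"
PROJECT_ID = "521a93a6-091d-4943-ba13-7c1a654a14ae"
headers = {"Authorization": API_KEY}

def latest_analysis_status(project_id):
    """Return the status of the project's most recent analysis (assumed response shape)."""
    projects = requests.get(f"{BASE_URL}/integration/projects", headers=headers).json()["data"]
    project = next(p for p in projects if p["id"] == project_id)
    return max(project["analyses"], key=lambda a: a["createdAt"])["status"]

deadline = time.time() + 10 * 60  # give up after 10 minutes
while time.time() < deadline:
    status = latest_analysis_status(PROJECT_ID)
    if status == "completed":
        print("Analysis finished")
        break
    if status == "failed":
        print("Analysis failed - handle or re-create as needed")
        break
    time.sleep(10)  # poll every 10 seconds, within the 5-10 second guideline
else:
    print("Timed out waiting for the analysis to complete")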

Error Handling

  • Validate project ID exists
  • Check response status before proceeding
  • Implement retry logic for transient failures (see the sketch below)
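
A retry sketch with exponential backoff around the create call. Treating 429 and 5xx responses as transient is an assumption here; tune the status codes, attempt count, and delays to your needs.

CODE
import time
import requests

API_KEY = "YOUR_API_KEY_HERE"
URL = "https://api.kruncher.ai/api/integration/analysis/company"
headers = {"Authorization": API_KEY, "Content-Type": "application/json"}
payload = {"projectId": "521a93a6-091d-4943-ba13-7c1a654a14ae"}

result = None
for attempt in range(5):
    try:
        response = requests.post(URL, headers=headers, json=payload, timeout=30)
    except requests.RequestException as exc:
        print(f"Network error: {exc}")
    else:
        if response.status_code == 200:
            result = response.json()
            break
        if response.status_code not in (429, 500, 502, 503, 504):
            # Errors such as 400/404 will not succeed on retry.
            print(f"Permanent error: {response.status_code} {response.text}")
            break
    time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, 8s, 16s

if result is None:
    print("Could not create the analysis")
else:
    print("Analysis created:", result["analysisId"])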

Troubleshooting

“Project not found”

  • Verify the project ID is correct
  • Ensure project hasn’t been deleted
  • Check you have access to the project

Analysis Stuck in “progress”

  • Wait up to 5 minutes for completion
  • Check file upload completed successfully
  • Contact support if stuck longer than 10 minutes

Multiple Analyses

  • Each analysis creates a new record
  • Old analyses remain accessible
  • Use createdAt to identify latest analysis

Need Help?

  • Bulk analysis? Contact support for batch processing
  • Long processing? Check file sizes and formats
  • Integration issues? Review the workflow examples above