Templates
Templates in Grepr provide a powerful way to create reusable pipeline components using JavaScript. They enable you to define parameterized job subgraphs that can be shared across pipelines, promoting consistency and reducing the operational burden of applying similar updates to many pipelines.
A template is a JavaScript script that generates a Grepr job graph dynamically based on input parameters. Templates can include conditional logic, loops, and complex data transformations to create sophisticated pipeline components that adapt to different use cases.
How Templates Work
Templates work by executing JavaScript code in a secure, sandboxed environment; the script returns a valid Grepr job graph. The template system follows this process:
- Input Validation: Validates input parameters against a JSON schema to ensure type safety
- Script Execution: Executes the JavaScript template in a secure, isolated environment
- Graph Generation: The script returns a job graph that gets integrated into the parent pipeline
- Graph Resolution: Resolves nested templates and connects them to the parent pipeline
- Pipeline Integration: The generated subgraph replaces the template operation in the pipeline
Templates can be used in two primary ways:
- Development and Testing: Use the test API endpoint to validate template logic during development
- Production Pipelines: Integrate templates into job graphs using the TemplateOperation
Template Resolution Process
When a pipeline containing template operations is executed, Grepr:
- Identifies template operations in the job graph
- Fetches template definitions from the repository (latest version or specified version)
- Validates input parameters against the template's input schema
- Executes the JavaScript with the provided inputs and user context
- Generates the subgraph from the script output
- Resolves nested templates recursively (up to 10 levels deep)
- Connects inputs/outputs based on the template's input/output mappings
- Renames operations to avoid conflicts in the merged graph (see the sketch below)
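For example, given a template whose inputs map logEvents to transform1:input and whose outputs map processedEvents to transform2:output (as in the schema example in the next section), resolution replaces the template operation with the generated subgraph and rewires the external edges through those mappings. The following is a rough sketch only: internal operation names are shown unrenamed for readability, while Grepr actually renames them during the merge to avoid conflicts, and vertex details are elided.
Before resolution, the pipeline references the template by ID:
{
  "vertices": [
    { "type": "logs-iceberg-table-source", "name": "source", "table": "logevent_raw_prod", "processing": "STREAM" },
    {
      "type": "template-operation",
      "name": "enrichment",
      "templateId": "template-12345",
      "templateInputs": { "tagKey": "environment", "threshold": 50 }
    },
    { "type": "logs-sync-sink", "name": "sink", "integration": "datadog-prod" }
  ],
  "edges": [
    "source -> enrichment:logEvents",
    "enrichment:processedEvents -> sink"
  ]
}
After resolution, the template's vertices replace the template operation and the edges are connected according to the input/output mappings:
{
  "vertices": [
    { "type": "logs-iceberg-table-source", "name": "source", "table": "logevent_raw_prod", "processing": "STREAM" },
    { "type": "log-transform", "name": "transform1", "transforms": [ ... ] },
    { "type": "log-transform", "name": "transform2", "transforms": [ ... ] },
    { "type": "logs-sync-sink", "name": "sink", "integration": "datadog-prod" }
  ],
  "edges": [
    "source -> transform1:input",
    "transform1 -> transform2",
    "transform2:output -> sink"
  ]
}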
Template Structure
A template consists of several key components that work together to define a reusable pipeline component:
Core Components
- Name and Description: Human-readable metadata for template identification
- Input Schema: JSON schema defining expected parameters and their types
- JavaScript Template: The script that generates the job graph
- Input Mappings: How external connections map to internal operations
- Output Mappings: How internal operations expose outputs to external consumers
- Draft Outputs: Special outputs available only when running in draft mode
Template Schema Definition
{
"name": "Log Transform Template",
"description": "Adds configurable log transformations with filtering",
"inputSchema": {
"type": "object",
"properties": {
"tagKey": {
"type": "string",
"description": "The tag key to modify in log messages"
},
"threshold": {
"type": "number",
"description": "Numeric threshold for filtering",
"minimum": 0,
"maximum": 1000
},
"enableDebug": {
"type": "boolean",
"description": "Enable debug logging",
"default": false
}
},
"required": ["tagKey"]
},
"template": "<!-- JavaScript code here -->",
"inputs": {
"logEvents": "transform1:input",
"config": "transform1:config"
},
"outputs": {
"processedEvents": "transform2:output",
"metrics": "metrics:output"
},
"draftOutputs": [
{
"name": "debugOutput",
"description": "Debug information",
"vertex": "debug"
}
]
}
JavaScript Template Guide
Templates are written in JavaScript and must return a valid GreprJobGraph object. The JavaScript environment provides access to input parameters, user context, and various utility functions.
Available Variables
Templates have access to several built-in variables:
- Input Parameters: All parameters defined in inputSchema are available as variables
- draft: Boolean indicating if the template is running in draft mode
- user: Object containing information about the current user (e.g., user.getName())
Basic Template Example
// Simple template that creates a log transformation
const graph = {
"vertices": [
{
"type": "log-transform",
"name": "transform1",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": tagKey,
"values": [`processed-${threshold}`]
}
]
},
{
"type": "log-transform",
"name": "transform2",
"transforms": [
{
"type": "tag-action",
"order": 1,
"modification": "SET",
"tagKey": "threshold",
"values": [threshold.toString()]
}
]
}
],
"edges": ["transform1 -> transform2"]
};
// Don't use a "return" statement; the last expression (the object or its JSON) is the script's output
graph;
Template with Conditional Logic
// Template that adapts based on input parameters and draft mode
const result = {
"vertices": [
{
"type": "log-transform",
"name": "transform1",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": tagKey,
"values": [`processed-${threshold}`]
}
]
},
{
"type": "log-transform",
"name": "transform2",
"transforms": [
{
"type": "tag-action",
"order": 1,
"modification": "SET",
"tagKey": "processing_threshold",
"values": [threshold.toString()]
}
]
}
],
"edges": ["transform1 -> transform2"]
};
// Add filtering based on threshold
if (threshold > 50) {
result.vertices.push({
"type": "logs-filter",
"name": "highThresholdFilter",
"query": {
"type": "simple-query",
"query": `threshold:>${threshold}`
}
});
result.edges.push("transform2 -> highThresholdFilter");
}
// Add debug vertex in draft mode
if (draft) {
result.vertices.push({
"type": "log-transform",
"name": "debug",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": "debug",
"values": ["true"]
}
]
});
result.edges.push("transform1 -> debug");
}
// Add user context
result.vertices.push({
"type": "log-transform",
"name": "userContext",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": "created_by",
"values": [user.getName()]
}
]
});
result.edges.push("transform2 -> userContext");
result;
Template with Dynamic Operations
// Template that creates operations based on input arrays
const operations = [];
const edges = [];
// Create transforms based on input configuration
const transformConfigs = JSON.parse(transformationsJson);
for (let i = 0; i < transformConfigs.length; i++) {
const config = transformConfigs[i];
const opName = `transform${i + 1}`;
operations.push({
"type": "log-transform",
"name": opName,
"transforms": [
{
"type": "tag-action",
"order": i,
"modification": config.modification || "ADD",
"tagKey": config.tagKey,
"values": config.values || ["default"]
}
]
});
// Connect to previous operation
if (i > 0) {
edges.push(`transform${i} -> ${opName}`);
}
}
// Add a filter if enabled
if (enableFiltering) {
operations.push({
"type": "logs-filter",
"name": "finalFilter",
"query": {
"type": "simple-query",
"query": "processed:true"
}
});
// Connect last transform to filter
if (operations.length > 1) {
edges.push(`transform${transformConfigs.length} -> finalFilter`);
}
}
({
"vertices": operations,
"edges": edges
});
Error Handling in Templates
Templates should validate inputs and throw meaningful errors when validation fails. Do not return a valid pipeline when inputs are invalid; this could result in a production pipeline running with incorrect configuration.
// Template with proper error handling and validation
// Validate required parameters
if (!tagKey || tagKey.trim() === '') {
throw new Error('tagKey cannot be empty');
}
if (threshold < 0 || threshold > 1000) {
throw new Error('threshold must be between 0 and 1000');
}
// Validate that transformations array exists and has valid structure
if (!Array.isArray(transformations) || transformations.length === 0) {
throw new Error('transformations must be a non-empty array');
}
for (let i = 0; i < transformations.length; i++) {
const config = transformations[i];
if (!config.tagKey || !config.operation) {
throw new Error(`Invalid transformation at index ${i}: missing tagKey or operation`);
}
if (!['ADD', 'SET', 'REMOVE'].includes(config.operation)) {
throw new Error(`Invalid operation "${config.operation}" at index ${i}. Must be ADD, SET, or REMOVE`);
}
}
// If validation passes, generate the pipeline
const result = {
"vertices": [
{
"type": "log-transform",
"name": "processor",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "SET",
"tagKey": tagKey,
"values": [threshold.toString()]
}
]
}
],
"edges": []
};
result;
Template API Reference
The Templates API provides comprehensive lifecycle management for templates. All endpoints are available under /v1/templates and require authentication.
Creating Templates
Create a new template using the POST /v1/templates endpoint:
curl -X POST "${GREPR_API_URL}/v1/templates" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"name": "Log Transform Template",
"description": "Adds configurable log transformations with conditional logic",
"inputSchema": {
"type": "object",
"properties": {
"tagKey": {
"type": "string",
"description": "Tag key to modify in log messages"
},
"threshold": {
"type": "number",
"description": "Numeric threshold for processing",
"minimum": 0,
"maximum": 1000
},
"enableDebug": {
"type": "boolean",
"description": "Enable debug outputs",
"default": false
}
},
"required": ["tagKey"]
},
"template": "const result = { vertices: [{ type: 'log-transform', name: 'transform1', transforms: [{ type: 'tag-action', order: 0, modification: 'ADD', tagKey: tagKey, values: [`processed-${threshold}`] }] }], edges: [] }; if (enableDebug) { result.vertices.push({ type: 'log-transform', name: 'debug', transforms: [{ type: 'tag-action', order: 0, modification: 'ADD', tagKey: 'debug', values: ['true'] }] }); result.edges.push('transform1 -> debug'); } result;",
"inputs": {
"logEvents": "transform1:input"
},
"outputs": {
"processedEvents": "transform1:output"
},
"draftOutputs": [
{
"name": "debugOutput",
"description": "Debug information",
"vertex": "debug"
}
]
}'
Listing Templates
Get all templates you have access to:
curl -X GET "${GREPR_API_URL}/v1/templates" \
-H "Authorization: Bearer ${JWT_TOKEN}"
Response includes metadata for each template:
{
"items": [
{
"id": "template-12345",
"name": "Log Transform Template",
"description": "Adds configurable log transformations",
"version": 3,
"organizationId": "my-org",
"createdAt": "2024-01-15T10:30:00Z",
"updatedAt": "2024-01-16T14:20:00Z"
}
]
}
Retrieving Templates
Get a specific template by ID with various options:
# Get latest version
curl -X GET "${GREPR_API_URL}/v1/templates/template-12345?latest=true" \
-H "Authorization: Bearer ${JWT_TOKEN}"
# Get specific version
curl -X GET "${GREPR_API_URL}/v1/templates/template-12345?version=2" \
-H "Authorization: Bearer ${JWT_TOKEN}"
# Get all versions
curl -X GET "${GREPR_API_URL}/v1/templates/template-12345" \
-H "Authorization: Bearer ${JWT_TOKEN}"
Updating Templates
Update a template, which creates a new version:
curl -X PUT "${GREPR_API_URL}/v1/templates/template-12345" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"name": "Enhanced Log Transform Template",
"description": "Updated template with improved error handling",
"version": 4,
"inputSchema": {
"type": "object",
"properties": {
"tagKey": {
"type": "string",
"description": "Tag key to modify"
},
"threshold": {
"type": "number",
"description": "Processing threshold",
"minimum": 0,
"maximum": 1000
},
"processingMode": {
"type": "string",
"description": "Processing mode",
"enum": ["strict", "permissive"],
"default": "permissive"
}
},
"required": ["tagKey", "threshold"]
},
"template": "updated JavaScript content with new processingMode logic"
}'
Deleting Templates
Delete a template and all its versions:
curl -X DELETE "${GREPR_API_URL}/v1/templates/template-12345" \
-H "Authorization: Bearer ${JWT_TOKEN}"
⚠️ Warning: This is a dangerous operation. Templates that are currently in use by active jobs can still be deleted. If you delete a template that a pipeline depends on, that pipeline will be unable to restart or be updated because the template can no longer be found. Always ensure a template is not in use before deleting it.
Testing Templates
The template test endpoint is crucial for development and validation. It allows you to execute template scripts with provided inputs and validate the output before deploying to production pipelines.
Test Template Endpoint
Use POST /v1/templates/test to execute a template script:
curl -X POST "${GREPR_API_URL}/v1/templates/test" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"template": {
"name": "Test Template",
"description": "Template for testing purposes",
"inputSchema": {
"type": "object",
"properties": {
"tagKey": {
"type": "string",
"description": "Tag key to process"
},
"threshold": {
"type": "number",
"description": "Processing threshold",
"minimum": 0
}
},
"required": ["tagKey"]
},
"template": "({ vertices: [{ type: 'log-transform', name: 'transform1', transforms: [{ type: 'tag-action', order: 0, modification: 'ADD', tagKey: tagKey, values: [`processed-${threshold || 0}`] }] }], edges: [] })"
},
"input": {
"isDraft": false,
"values": {
"tagKey": "environment",
"threshold": 42
}
},
"checkValidGreprJobGraph": true
}'
Test Request Parameters
template
: The complete template object to testinput.isDraft
: Whether to run in draft mode (enables draft outputs)input.values
: Input values for the template parameterscheckValidGreprJobGraph
: If true, validates that the result is a valid GreprJobGraph
Test Response Examples
Successful execution:
{
"vertices": [
{
"type": "log-transform",
"name": "transform1",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": "environment",
"values": ["processed-42"]
}
]
}
],
"edges": []
}
Error response:
{
"error": "JavaScript execution failed: tagKey is not defined"
}
Advanced Testing Examples
Testing with draft mode:
curl -X POST "${GREPR_API_URL}/v1/templates/test" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"template": {
"name": "Draft-Aware Template",
"inputSchema": {
"type": "object",
"properties": {
"serviceName": { "type": "string" }
},
"required": ["serviceName"]
},
"template": "const result = { vertices: [{ type: 'log-transform', name: 'service', transforms: [{ type: 'tag-action', order: 0, modification: 'ADD', tagKey: 'service', values: [serviceName] }] }], edges: [] }; if (draft) { result.vertices.push({ type: 'log-transform', name: 'debug', transforms: [{ type: 'tag-action', order: 0, modification: 'ADD', tagKey: 'debug_service', values: [serviceName] }] }); result.edges.push('service -> debug'); } result;"
},
"input": {
"isDraft": true,
"values": {
"serviceName": "web-api"
}
},
"checkValidGreprJobGraph": true
}'
Testing complex templates with validation:
curl -X POST "${GREPR_API_URL}/v1/templates/test" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-H "Content-Type: application/json" \
-d '{
"template": {
"name": "Complex Validation Template",
"inputSchema": {
"type": "object",
"properties": {
"transformations": {
"type": "array",
"items": {
"type": "object",
"properties": {
"tagKey": { "type": "string" },
"tagValue": { "type": "string" },
"operation": { "type": "string", "enum": ["ADD", "SET", "REMOVE"] }
},
"required": ["tagKey", "tagValue", "operation"]
}
}
},
"required": ["transformations"]
},
"template": "const vertices = []; const edges = []; transformations.forEach((t, i) => { vertices.push({ \"type\": \"log-transform\", \"name\": `transform${i+1}`, \"transforms\": [{ \"type\": \"tag-action\", \"order\": i, \"modification\": t.operation, \"tagKey\": t.tagKey, \"values\": [t.tagValue] }] }); if (i > 0) edges.push(`transform${i} -> transform${i+1}`); }); ({ \"vertices\": vertices, \"edges\": edges });"
},
"input": {
"isDraft": false,
"values": {
"transformations": [
{
"tagKey": "environment",
"tagValue": "production",
"operation": "ADD"
},
{
"tagKey": "processed",
"tagValue": "true",
"operation": "SET"
}
]
}
},
"checkValidGreprJobGraph": true
}'
Using Templates in Pipelines
Templates are integrated into pipelines using the TemplateOperation. This operation type executes the template script and replaces itself with the generated subgraph during pipeline resolution.
Template Operation Configuration
The TemplateOperation supports the following configuration:
{
"type": "template-operation",
"name": "myTemplate",
"templateId": "template-12345",
"templateVersion": 2,
"templateInputs": {
"tagKey": "environment",
"threshold": 100,
"enableDebug": true
},
"draftMode": false
}
Template Operation Parameters
templateId
(required): The ID of the template to usetemplateVersion
(optional): Specific version to use. If omitted, uses the latest versiontemplateInputs
: Key-value pairs passed to the template script as variablesdraftMode
(optional): Enables draft mode, which may add additional debug operations
Complete Pipeline Example
{
"vertices": [
{
"type": "logs-iceberg-table-source",
"name": "source",
"table": "logevent_raw_prod",
"processing": "STREAM",
"query": {
"type": "simple-query",
"query": "service:web-api"
}
},
{
"type": "template-operation",
"name": "enrichment",
"templateId": "template-12345",
"templateVersion": 3,
"templateInputs": {
"tagKey": "environment",
"threshold": 50,
"enableMetrics": true
},
"draftMode": false
},
{
"type": "template-operation",
"name": "filtering",
"templateId": "template-67890",
"templateInputs": {
"filterQuery": "status:error",
"samplingRate": 0.1
}
},
{
"type": "logs-sync-sink",
"name": "sink",
"integration": "datadog-prod"
}
],
"edges": [
"source -> enrichment:logEvents",
"enrichment:processedEvents -> filtering:input",
"filtering:output -> sink"
]
}
Template Input/Output Mapping
Templates define how external connections map to internal operations through input and output mappings:
{
"inputs": {
"logEvents": "transform1:input",
"configData": "processor:config"
},
"outputs": {
"processedEvents": "transform2:output",
"metrics": "metrics:output",
"errors": "errorHandler:output"
}
}
Connection Resolution:
- External enrichment:logEvents -> Internal transform1:input
- Internal transform2:output -> External enrichment:processedEvents
- Internal metrics:output -> External enrichment:metrics
Nested Template Support
Templates can contain other template operations, enabling powerful composition patterns. Grepr resolves nested templates recursively, with a maximum depth of 10 levels to prevent infinite recursion.
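A minimal sketch of such composition is shown below. It assumes the generating template declares tagKey in its inputSchema, and it reuses the filtering template from the Complete Pipeline Example above (template-67890, exposing an input named input); both are illustrative rather than real IDs in your organization.
// Template script whose generated graph embeds another template by reference.
// The nested template-operation is resolved recursively (up to 10 levels)
// when the parent pipeline is resolved.
const graph = {
  "vertices": [
    {
      "type": "log-transform",
      "name": "pretransform",
      "transforms": [
        {
          "type": "tag-action",
          "order": 0,
          "modification": "ADD",
          "tagKey": tagKey,
          "values": ["pre-filtered"]
        }
      ]
    },
    {
      "type": "template-operation",
      "name": "innerFilter",
      "templateId": "template-67890",
      "templateInputs": {
        "filterQuery": "status:error",
        "samplingRate": 0.1
      }
    }
  ],
  "edges": ["pretransform -> innerFilter:input"]
};
graph;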
Draft Mode
Draft mode is a special execution mode that enables templates to run in an alternate configuration with additional outputs. This is commonly used by the Grepr UI to show outputs from each stage when building a pipeline, allowing users to inspect intermediate results throughout the pipeline.
Enabling Draft Mode
Draft mode can be enabled in two ways:
- Template Operation: Set draftMode: true in the template operation
- Test Endpoint: Set input.isDraft: true when testing templates
Draft Outputs
Templates can define special outputs that are only available in draft mode:
{
"draftOutputs": [
{
"name": "debugOutput",
"description": "Debug information for development",
"vertex": "debug"
},
{
"name": "metricsOutput",
"description": "Detailed metrics for analysis",
"vertex": "metrics"
}
]
}
Draft Mode Template Example
const result = {
"vertices": [
{
"type": "log-transform",
"name": "processor",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": "processed",
"values": ["true"]
}
]
}
],
"edges": []
};
// Add debug operations only in draft mode
if (draft) {
result.vertices.push({
"type": "log-transform",
"name": "debug",
"transforms": [
{
"type": "tag-action",
"order": 0,
"modification": "ADD",
"tagKey": "debug_timestamp",
"values": [new Date().toISOString()]
},
{
"type": "tag-action",
"order": 1,
"modification": "ADD",
"tagKey": "debug_user",
"values": [user.getName()]
}
]
});
result.vertices.push({
"type": "logs-sync-sink",
"name": "debugSink"
});
result.edges.push("processor -> debug");
result.edges.push("debug -> debugSink");
}
result;
Security and Sandboxing
Templates execute in a secure JavaScript sandbox environment designed to prevent malicious code execution while providing necessary functionality for pipeline generation.
Security Restrictions
The JavaScript execution environment enforces several security restrictions:
- No file system access: Cannot read, write, or access files
- No network access: Cannot make HTTP requests or connect to external services
- No reflection access: Cannot use Java reflection or access system classes
- No class loading: Cannot dynamically load classes or access class loaders
- No system access: Cannot access system properties or environment variables
- No native code: Cannot execute native code or shell commands
Execution Limits
Templates are subject to execution limits to prevent resource exhaustion:
- Execution timeout: Scripts are terminated if they exceed the configured timeout
- Memory limits: Scripts cannot consume excessive memory
- CPU limits: Long-running computations are interrupted
- Instruction limits: Scripts are monitored for excessive instruction execution
Allowed Java Objects
Templates can access a limited set of Java objects:
- User object: Access to current user information (e.g., user.getName())
- Input parameters: All parameters defined in the template's input schema
- Built-in JavaScript objects: Standard JavaScript objects and functions
Error Handling and Debugging
Templates should handle edge cases gracefully and provide meaningful error messages. Input validation should generally be handled by the input JSON schema defined with the template, but you can also add custom validation logic within the template script if needed.
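For example, schema constraints can reject bad input before the script runs at all. A minimal sketch follows, assuming the validator honors standard JSON Schema keywords such as enum and minLength (minimum/maximum already appear in the schema examples above):
{
  "inputSchema": {
    "type": "object",
    "properties": {
      "tagKey": {
        "type": "string",
        "description": "Tag key to modify; must not be empty",
        "minLength": 1
      },
      "operation": {
        "type": "string",
        "description": "Tag modification to apply",
        "enum": ["ADD", "SET", "REMOVE"]
      },
      "threshold": {
        "type": "number",
        "description": "Processing threshold",
        "minimum": 0,
        "maximum": 1000
      }
    },
    "required": ["tagKey", "operation"]
  }
}
Inputs that violate these constraints fail with the "Input validation failed" error listed below, before any JavaScript executes.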
Common Error Messages
- "Script execution returned null or undefined": Template didn't return a valid object
- "JavaScript execution failed": Syntax error or runtime exception in the template
- "Could not parse resolved template into a GreprJobGraph": Generated JSON is invalid
- "Template not found": Template ID doesn't exist or user lacks access
- "Input validation failed": Input parameters don't match the schema
Version Management
Templates support comprehensive versioning to enable safe updates, rollbacks, and concurrent development.
Version Lifecycle
- Creation: New templates start at version 1
- Updates: Each update creates a new version (versions are immutable)
- Deletion: Previous versions are tombstoned but not physically deleted
- Rollback: Can reference any previous version in pipeline operations
Version Strategies
Development Strategy:
- Use latest version during development and testing
- Pin to specific versions for production deployment
- Test new versions in staging before promoting to production
Production Strategy:
- Always pin to specific versions in production pipelines
- Plan version upgrades during maintenance windows
- Maintain rollback capability to previous versions
Version Compatibility
{
"templateOperation": {
"type": "template-operation",
"name": "processor",
"templateId": "template-12345",
"templateVersion": 3, // Pin to specific version
"templateInputs": {
"tagKey": "environment",
"threshold": 100
}
}
}
Version Migration Example
# Test new version
curl -X POST "${GREPR_API_URL}/v1/templates/test" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-d '{"template": {"...": "new version"}, "input": {"...": "test data"}}'
# Deploy new version
curl -X PUT "${GREPR_API_URL}/v1/templates/template-12345" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-d '{"version": 4, "template": "new version content"}'
# Update pipeline to use new version
curl -X PUT "${GREPR_API_URL}/v1/jobs/job-12345" \
-H "Authorization: Bearer ${JWT_TOKEN}" \
-d '{"vertices": [{"type": "template-operation", "templateVersion": 4}]}'
Best Practices
Template Design Guidelines
- Single Responsibility: Each template should have a focused, well-defined purpose
- Parameter Validation: Define comprehensive input schemas with proper constraints
- Error Handling: Include robust error handling for edge cases
- Documentation: Provide clear descriptions for templates and parameters
- Testing: Thoroughly test templates with various input combinations
Development Workflow
- Design First: Plan template structure and parameters before coding
- Iterative Development: Start simple and add complexity gradually
- Test Early: Use the test endpoint throughout development
- Version Control: Track template changes and maintain version history
- Peer Review: Have other developers review complex templates
Production Deployment
- Version Pinning: Always specify exact versions in production pipelines
- Gradual Rollout: Deploy new templates to staging before production
- Monitoring: Monitor template execution performance and error rates
- Rollback Planning: Maintain ability to rollback to previous versions
- Documentation: Keep template documentation up-to-date
Performance Guidelines
- Optimize JavaScript: Keep template scripts efficient and lightweight
- Minimize Nesting: Avoid deeply nested template compositions
- Cache Results: Cache template results where appropriate
- Monitor Resources: Track memory and CPU usage of template execution
- Regular Cleanup: Remove unused templates and old versions
Troubleshooting
Common Issues and Solutions
Template not found errors:
- Verify template ID exists and user has access
- Check if template was deleted or access permissions changed
- Ensure correct organization context
JavaScript execution failures:
- Validate JavaScript syntax using the test endpoint
- Check that all required parameters are provided
- Verify parameter types match the input schema
Invalid job graph errors:
- Ensure template returns valid JSON structure
- Verify all required fields are present in vertices and edges
- Check that operation types are valid and exist in the system
Version conflicts:
- Pin specific versions in production pipelines
- Plan version upgrades during maintenance windows
- Test new versions thoroughly before deployment
Debugging Steps
- Test in isolation: Use the templates test endpoint to validate template logic
- Check input validation: Verify input parameters match schema requirements
- Use draft mode: Enable draft mode to inspect intermediate outputs
- Simplify complexity: Remove complex logic to isolate the issue
- Validate operations: Ensure all operation types used are valid
More details on the Templates API are available in the API specification.