# Datasource Creator

Create OData or flow-based datasource configurations for Datex Studio.
## References

- `../shared/branch-setup.md` -- Branch & connection selection (shared across skills)
- `references/parameter-strategies.md` -- Parameter strategies, linked datasources, quoting rules, cascading params
## Dependencies

REQUIRED BACKGROUND: Read the `schema-explorer` skill for OData entity discovery and the `odata-execution` skill for query building and verification.

Schema exploration is mandatory for OData datasources — the type definition must be built from validated schema, not from examples or templates alone. Flow datasources may aggregate data from OData queries under the hood, but when a flow builds on already-existing datasources, we assume those datasources have already been verified.
## Prerequisites check
Before starting schema exploration or datasource generation, check whether a requirements brief already exists in the conversation context (produced by requirements-gathering or report-creator).
- Requirements brief exists → use it. The brief provides the field list, semantic roles, and business rules that drive which entities to explore and which fields to include. Verify datasource output against the brief.
- No requirements brief exists → invoke the `requirements-gathering` skill first. This happens when datasource-creator is invoked standalone (not from report-creator). The brief ensures you don't miss fields, map entities incorrectly, or skip calculated fields.
Do NOT skip this check. Building a datasource without understanding what fields are needed and what they mean leads to incomplete configs and rework.
## Input/Output Contract
| Direction | Item | Details |
|---|---|---|
| Input | Connection ID | Required -- identifies the Footprint API connection |
| Input | Branch ID | Required -- target branch for validation and upsert |
| Input | Mode | standalone (upsert to branch) or owned (embed in report) |
| Input | Datasource type | OData (query-based) or Flow (custom JS code) |
| Output | JSON config file | Generated by dxs datasource generate or generate-flow |
| Output (standalone) | Upserted datasource | Uploaded to branch, verified with datasource-fields, test data discovered |
| Output (owned) | File path + reference name | Returned for use with --owned FILE:ALIAS on dxs report datasource add |
## Steps by Mode

| Step | Standalone | Owned |
|---|---|---|
| `dxs datasource generate` / `generate-flow` | Yes | Yes |
| `dxs datasource context` (type defs) | Yes | Yes |
| `dxs datasource validate` (against branch) | Yes | Yes |
| `dxs datasource upsert` (to branch) | Yes | No |
| `datasource-fields` (post-upsert verification) | Yes | No (deferred to after report upload) |
| Test data / `in_params` discovery | Yes | No (report-creator handles) |
## Workflow

```
[requirements brief in context?]
        |
  +-----+-----+
  |           |
 YES          NO
  |           |
use it     invoke `requirements-gathering` skill
  |           |
  +-----+-----+
        |
[determine type: OData or Flow]
        |
  +-----+-----+
  |           |
OData        Flow
  |           |
schema ->    schema -> (REQUIRED: validate entities/properties
query ->     the flow code will query against the connection)
generate     create standalone OData datasources (generate + upsert each)
  |          write type-def YAML (from validated schema, NOT from examples)
  |          write flow TS code (referencing $datasources.<RepoName>.<ref_name>)
  |          generate-flow
  |           |
  +-----+-----+
        |
context  (dxs datasource context - get type defs)
validate (dxs datasource validate - both modes)
        |
  +-----+-----+
  |           |
Standalone   Owned
  |           |
upsert       return JSON file path
fields       + ref name
test data
return ref
```
## When to use OData vs Flow
| Use OData when | Use Flow when |
|---|---|
| All needed fields are scalar or reachable through single navigation properties (no collections in the path) | The result needs data from collection navigation properties flattened into scalar fields (e.g., a single-entity shipment where OrderLookups or WarehousesContactsLookup must appear as flat fields) |
| The result IS a collection that maps directly to a table/tablix (e.g., ShipmentLines) | Multiple OData queries need to be combined or joined into a single result |
| Simple parameter-based filtering is sufficient | Calculated fields require data from multiple entities or custom aggregation |
How to tell: Run `dxs report datasource-fields <ref> --branch <id>` (or `--report <ref>` for owned). If the output has a `collections:` section with fields you need in standalone textboxes (not tables), use a flow datasource. Collections in OData datasources cannot be bound as flat DataSet fields — they silently resolve to blank.
The production pattern: All existing Datex Studio reports with complex navigation (packing slips, BOLs, master BOLs) use flow datasources for their header/detail data. The flow code fetches the OData entities and flattens collections into scalar fields. OData datasources are used for simple list queries (line items, lookup tables).
Flow datasources and schema exploration: Flow datasources are NOT a shortcut around schema exploration. A flow datasource's JavaScript code typically queries one or more OData entities and reshapes the data. Before writing the type definition or flow code, you MUST use `schema-explorer` to validate that the target connection has the expected entities and properties. Existing report examples and templates show what has worked on some connection — they are starting hypotheses, not validated designs for the current connection.
## Return to Caller
After completing the workflow (either standalone or owned), return this structured summary to the calling skill or user:
- **Reference name** — the `-r` value (e.g., `ds_shipment_bol`)
- **File path** — the `-o` output path (e.g., `reports/bol/ds_shipment_bol.json`)
- **Mode** — `standalone` (upserted) or `owned` (local file for `--owned`)
- **Result type** — `single` or `collection` (from the generated config's `resultIsCollection`)
- **in_params** — list of input parameter names and types (from the config's `inParams`), or empty if none
- **Field summary** — read the generated JSON config and extract the field tree from `queryOptionsObjectTypeDef`. List fields as `name: type` with dot-notation for nested objects. Mark collections. This gives the caller a machine-derived field list without re-reading the field-mapping artifact.
Example return:

```
Datasource: ds_shipment_bol
File: reports/bol/ds_shipment_bol.json
Mode: owned
Result type: single
in_params: shipmentId (number)
Fields:
  Id: number (key)
  BillOfLading: string
  LookupCode: string
  Carrier.Name: string
  Carrier.ScacCode: string
  Status.Name: string
  ShipmentLines [collection]:
    LineNumber: number
    OrderLine.Material.LookupCode: string
    OrderLine.Material.Description: string
```
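The field summary above can be derived mechanically from the config's `queryOptionsObjectTypeDef`. A minimal sketch of that walk — the `FieldNode` shape is an assumption mirroring the type-def YAML format used by `generate-flow` (`id`, `type`, `isCollection`, `objectTypeDef`); verify it against your generated JSON before relying on it:

```typescript
// Assumed node shape (mirrors the type-def YAML format); not an official API type.
interface FieldNode {
  id: string;
  type: string;
  isCollection?: boolean;
  objectTypeDef?: FieldNode[];
}

// Walk the tree and emit "name: type" lines, dot-joining nested object paths
// and marking collections so the caller can spot fields that need flattening.
function summarizeFields(nodes: FieldNode[], prefix = ''): string[] {
  const lines: string[] = [];
  for (const n of nodes) {
    const path = prefix ? `${prefix}.${n.id}` : n.id;
    if (n.isCollection) {
      lines.push(`${path} [collection]:`);
      lines.push(...summarizeFields(n.objectTypeDef ?? [], path));
    } else if (n.objectTypeDef) {
      lines.push(...summarizeFields(n.objectTypeDef, path));
    } else {
      lines.push(`${path}: ${n.type}`);
    }
  }
  return lines;
}
```

For the example config above, this yields lines like `Carrier.Name: string` and `ShipmentLines [collection]:` without re-reading the field-mapping artifact.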
Collection fields in the return require caller action. When the field summary contains `[collection]` markers, the caller must choose:
- **Flow datasource (preferred)**: Rewrite the datasource as a flow that flattens collections into scalar fields. The caller gets a flat field list with no collections.
- **Child datasets (alternative)**: Create a separate DataSet in the report with `CommandText: "$.ds_name.result.CollectionPath.*"` and use `=First(Fields!Field.Value, "child_dataset")` in standalone textboxes. This works but adds complexity.
- **Separate OData datasource**: Query the collection entity directly (e.g., `ShipmentOrderLookups?$filter=ShipmentId eq {id}`) as its own datasource. Only viable when the entity supports direct filtering.
Never pass collection-path fields (e.g., `OrderLookups.Order.OwnerReference`) to `dxs report dataset add --field` as flat fields on a single-result DataSet. They will silently render blank.
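To illustrate the flattening the preferred flow option performs, here is a minimal sketch. The `Shipment` shape, the field names, and the `flattenShipment` helper are all hypothetical — real shapes must come from validated schema, and in actual flow code the input would arrive via a `$datasources...get()` call rather than a function parameter:

```typescript
// Hypothetical shapes — real entity and field names must come from validated schema.
interface Order { OwnerReference?: string; }
interface OrderLookup { Order?: Order; }
interface Shipment {
  Id: number;
  BillOfLading?: string;
  OrderLookups?: OrderLookup[]; // collection nav property — cannot bind flat
}

// Flatten the collection into scalar fields a report textbox can bind directly.
function flattenShipment(s: Shipment) {
  return {
    Id: s.Id,
    BillOfLading: s.BillOfLading ?? '',
    // Pick the first lookup's owner reference as a scalar field.
    OwnerReference: s.OrderLookups?.[0]?.Order?.OwnerReference ?? '',
  };
}
```

The output object has only scalar fields, so the caller gets a flat field list with no `[collection]` markers.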
## OData Datasource Generation

Generate an OData datasource config with `dxs datasource generate`:

```shell
dxs datasource generate \
  -c <connection_id> \
  -q '<odata_query>' \
  -r <reference_name> \
  -t "<reference_name>" \
  -d "<description>" \
  --api-setting-name <app_level_name> \
  -o ds_name.json \
  --branch <branch_id>
```
### Key Flags

| Flag | Purpose |
|---|---|
| `-c` | Connection ID |
| `-q` / `-Q` | OData query string / query file (mutually exclusive) |
| `-r` | Reference name (valid JS identifier, `ds_` prefix convention) |
| `-t` | Display title (must match `-r` -- see Naming Convention) |
| `-d` | Description (always provide) |
| `--api-setting-name` | App-level name from `dxs source branch settings` (NOT the manager connection name) |
| `--param-keys` | For single-entity queries with `Entity(0)` pattern |
| `--detect-params` | Detect required filter parameters using `${$datasource.inParams.paramName}` syntax |
| `--dynamic-filter PROP:TYPE` | Optional UI filtering (generates conditional filter with `$utils.isDefined()` guard) |
| `--dynamic-orderby PROP` | Optional UI sorting |
| `--param-filter PROPERTY:OPERATOR:PARAM_NAME:TYPE` | Conditional filters with `$utils.isDefined()` guards. Operators: `eq`, `ne`, `gt`, `ge`, `lt`, `le`, `in`, `contains`, `startswith`, `endswith` |
| `--linked name:type:target` | Linked datasource. `oneToOne`/`oneToMany` = 3-part; `oneToOneWithMerge` = 4-part with `$entity.Field` |
| `--linked-param LINKED_NAME:PARAM_ID:EXPRESSION` | Map parent fields to linked datasource input parameters |
| `--custom-column` | Add computed columns |
| `--private` | Set access modifier to private (default: public) |
| `-o` | Output file path |
| `--branch` | Target branch ID |
### Parameter Strategy

| Need | Flag | Query syntax |
|---|---|---|
| Scope to one entity (detail/document) | `--param-keys` | `Entity(0)?$select=...` |
| Required filter params (report passes values) | `--detect-params` | Use `${$datasource.inParams.paramName}` in `$filter` |
| Optional UI list filtering | `--dynamic-filter PROP:TYPE` + `--dynamic-orderby PROP` | `Entity?$select=...` (no placeholders) |
| Conditional filters with guards | `--param-filter PROPERTY:OPERATOR:PARAM_NAME:TYPE` | Auto-generates `$utils.isDefined()` guard |
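The guard pattern `--param-filter` generates can be pictured with a plain sketch. This is conceptual, not the generator's actual output: `isDefined` stands in for `$utils.isDefined()`, and the property and parameter names (`StatusId`, `lookup`) are hypothetical:

```typescript
// Stand-in for $utils.isDefined() — the generated guard skips absent params.
const isDefined = (v: unknown): boolean => v !== undefined && v !== null;

// Conceptual sketch: each --param-filter entry contributes one guarded clause,
// so a filter only applies when its parameter was actually supplied.
function buildFilter(inParams: { statusId?: number; lookup?: string }): string {
  const clauses: string[] = [];
  if (isDefined(inParams.statusId)) clauses.push(`StatusId eq ${inParams.statusId}`);
  if (isDefined(inParams.lookup)) clauses.push(`contains(LookupCode,'${inParams.lookup}')`);
  return clauses.join(' and ');
}
```

With both params set this yields `StatusId eq 5 and contains(LookupCode,'ABC')`; with neither, the filter collapses to an empty string instead of a broken query.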
## Flow Datasource Generation

### Flow Runtime Model (CRITICAL)

Flow datasource code does NOT execute raw OData queries. Instead, flows reference standalone OData datasources that already exist on the branch, using the `$datasources` object. Standalone datasources are scoped under their repository module name (the repo's `uniqueIdentifier` name, e.g., `PurchaseOrders`, `AsnOrders`), so the path is `$datasources.<RepoName>.<ref_name>`:
```typescript
// CORRECT: Reference standalone OData datasources with module scope, unwrap .result
const shipmentResp = await $datasources.PurchaseOrders.ds_shipment.get({ shipmentId: $flow.inParams.shipmentId });
const shipment = shipmentResp.result;

// For collection datasources, .result is an array:
const contactsResp = await $datasources.PurchaseOrders.ds_wh_contacts.get({ warehouseId: shipment.ActualWarehouse.Id });
const firstContact = contactsResp.result?.[0]?.Contact;

// Set output via $flow.outParams.result
$flow.outParams.result = { Name: shipment.Name, Phone: firstContact?.PrimaryTelephone };

// WRONG: Missing module scope — datasources are not at the root level
const result = await $datasources.ds_shipment.get({ shipmentId: 123 });

// WRONG: Raw OData queries in flow code — this is NOT how flows work
const result = await $datasource.getList({ query: 'Shipments(123)?$expand=Carrier' });
```
Flow code runtime variables:

| Variable | Purpose |
|---|---|
| `$flow.inParams` | Access the flow datasource's input parameters |
| `$flow.outParams.result` | Set the flow's output (assign, don't return) |
| `$datasources.<RepoName>.<ref_name>` | Access standalone datasources on the branch (module-scoped by repository name) |
Unwrapping responses: All `$datasources` calls return `{ result?: ... }`. For `--param-keys` datasources, `result` is a single object. For collection datasources, `result` is an array. Always access `.result` before navigating into fields.
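A small helper pair makes the unwrap-first rule hard to forget. The `DsResponse` interface is my name for the `{ result?: ... }` envelope described above — it is not part of the runtime, just a sketch of its shape:

```typescript
// Hypothetical envelope type matching the { result?: ... } shape described above.
interface DsResponse<T> { result?: T; }

// Unwrap a single-object response, failing loudly if the entity was not found.
function unwrapSingle<T>(resp: DsResponse<T>, what: string): T {
  if (resp.result === undefined) throw new Error(`${what}: no result returned`);
  return resp.result;
}

// Unwrap a collection response, treating a missing result as an empty list.
function unwrapList<T>(resp: DsResponse<T[]>): T[] {
  return resp.result ?? [];
}
```

Failing loudly on a missing single result is usually better in flow code than letting `undefined` propagate into field navigation.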
The workflow for building a flow datasource:

1. Create standalone OData datasources for each query the flow needs — use `dxs datasource generate` + `dxs datasource upsert` for each
2. Write the flow code referencing those datasources via `$datasources.<RepoName>.<ref_name>.get()` or `$datasources.<RepoName>.<ref_name>.getList()` (where `<RepoName>` is the repository's `name` from `dxs source repo list`)
3. Generate the flow config with `dxs datasource generate-flow`, which embeds the flow code and type definition
4. The flow datasource itself can be owned (embedded in the report), but its OData dependencies must be standalone on the branch
`$datasources` API:

| Method | Use when | Returns |
|---|---|---|
| `$datasources.RepoName.ds_name.get({ paramName: value })` | The OData datasource uses `--param-keys` (single entity) | Single object |
| `$datasources.RepoName.ds_name.getList({ paramName: value })` | The OData datasource returns a collection | Array of objects |

Pass input parameters as an object — the keys must match the OData datasource's `inParams` exactly.
`--param-keys` creates named params, NOT a keys array. When an OData datasource uses `--param-keys` (e.g., `Shipments(0)`), the generator creates `inParams` named after the entity key (e.g., `shipmentId`). Always check the generated config's `inParams` to get the exact param name. Never use `{ keys: [value] }` — that pattern does not exist.
```typescript
// CORRECT: Use the actual inParam name from the generated config (module-scoped)
const resp = await $datasources.PurchaseOrders.ds_shipment.get({ shipmentId: $flow.inParams.shipmentId });

// WRONG: There is no "keys" parameter — this causes TS compilation errors
const resp = await $datasources.PurchaseOrders.ds_shipment.get({ keys: [$flow.inParams.shipmentId] });
```
### Generation Command

Generate a flow datasource config with `dxs datasource generate-flow`:

```shell
dxs datasource generate-flow \
  -r ds_lookup -t "ds_lookup" -d "Custom lookup datasource" \
  --type-def types.yaml \
  --get-flow get.ts \
  --in-param id:number \
  -o ds_lookup.json --branch <branch_id>
```
Required: At least one flow method (`--get-flow`, `--get-list-flow`, or `--get-by-keys-flow`) and a type definition file (`--type-def`).
### Key Flags

| Flag | Purpose |
|---|---|
| `--type-def FILE` | YAML/JSON file defining output type shape (required) |
| `--get-flow FILE` | JavaScript code for single-entity retrieval |
| `--get-list-flow FILE` | JavaScript code for collection retrieval |
| `--get-by-keys-flow FILE` | JavaScript code for key-based retrieval (requires `--key`) |
| `--on-init-flow FILE` | JavaScript code for initialization |
| `--in-param NAME:TYPE` | Input parameter (append `?` for optional, e.g., `search:string?`). Repeatable |
| `--key NAME:TYPE` | Key field definition. Repeatable |
| `--collection` | Force `resultIsCollection=true` |
| `--single` | Force `resultIsCollection=false` |
### Type Definition YAML Format

```yaml
- id: Id
  type: number
- id: Name
  type: string
- id: Items
  type: object
  isCollection: true
  objectTypeDef:
    - id: LineNumber
      type: number
```
Valid types: `string`, `number`, `boolean`, `date`, `object`, `union`, `blob`.

Enhancement flags (`--dynamic-filter`, `--linked`, `--custom-column`, etc.) work the same as OData datasources.

File size limit: All code files and the type definition file are limited to 512 KB.
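The YAML node shape above maps onto a small recursive type, which also makes it easy to sanity-check a type definition before feeding it to `generate-flow`. The `TypeDefNode` name and the validator are mine, assuming only the shape shown above:

```typescript
// Valid leaf/container types per the list above.
type DxsType = 'string' | 'number' | 'boolean' | 'date' | 'object' | 'union' | 'blob';

// Recursive node shape implied by the YAML format above (assumed, not official).
interface TypeDefNode {
  id: string;
  type: DxsType;
  isCollection?: boolean;
  objectTypeDef?: TypeDefNode[]; // only meaningful when type is 'object'
}

// Sanity-check a type-definition tree: objectTypeDef must sit on object nodes.
function validateTypeDef(nodes: TypeDefNode[], path = ''): string[] {
  const errors: string[] = [];
  for (const n of nodes) {
    const here = path ? `${path}.${n.id}` : n.id;
    if (n.objectTypeDef && n.type !== 'object') {
      errors.push(`${here}: objectTypeDef requires type 'object'`);
    }
    if (n.objectTypeDef) errors.push(...validateTypeDef(n.objectTypeDef, here));
  }
  return errors;
}
```

Running the check on a hand-written YAML (after parsing it) catches structural mistakes before the CLI does.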
## Context Command

See `../shared/context-navigation.md` for the full guide on retrieving and reading context responses, including backend vs frontend symbol filtering.

```shell
dxs -O json datasource context <file.json> --branch <branch_id>
```
Returns designer type definitions for writing expressions or TypeScript code. Works for both OData and flow datasources. Run after `generate` / `generate-flow` to understand the available fields before writing custom columns or flow code.

For datasources, the primary scope symbols are `$entity`, `$ccentity`, and `$datasource` (in `flowContext` or `linkedDatasourcesContext`). The `appContext` contains additional services — read `defaultContext.imports` to determine which ones are available (see the shared reference).
## Validation

```shell
dxs datasource validate <file.json> --branch <branch_id>
```

Validates the datasource config against the branch. Run for BOTH standalone and owned modes before proceeding to upsert or report upload.
## Standalone Completion

After validation passes, complete the standalone workflow:

### 1. Upsert

```shell
dxs datasource upsert <file.json> --branch <branch_id>
```

### 2. Verify Fields

```shell
dxs report datasource-fields <reference_name> --branch <branch_id>
```

Check these in the output:

- `in_params` names -- must match exactly in `--datasource-param` (don't assume `id`)
- `result_type` -- `single` vs `list` affects report layout
- `collections` -- nav-property collections available for table sections
- Field paths -- exact dot-notation paths for expressions
### 3. Discover Test Data

Query the entity without template literal params, using the base filter + `$top=5` + `$orderby` to find recent records:

```shell
dxs odata execute -c <id> \
  -q 'Entity?$top=5&$filter=<base_filters>&$select=Id,<param_fields>&$orderby=<date_field> desc'
```

Verify count with `$count=true&$top=1`:

```shell
dxs odata execute -c <id> \
  -q 'Entity?$count=true&$top=1&$filter=<full_filter_with_real_values>&$select=Id'
```
### 4. Deleting a Datasource

Delete by reference name or by config ID:

```shell
dxs datasource delete ds_my_report --branch <branch_id>
dxs datasource delete --id 42 --branch <branch_id>
```

Use `--id` when reference-name lookup returns 404 (can happen on branches with component modules). Get the ID from `dxs datasource list`.
## Naming Convention (CRITICAL)

The datasource reference name (`-r`), display title (`-t`), the RDLX-JSON DataSet name, and the `--owned` alias must ALL be identical:

```
-r ds_my_report -t "ds_my_report"         # generate: -t = -r
DataSet.Name = "ds_my_report"             # RDLX-JSON
--owned ds_my_report.json:ds_my_report    # report datasource add: file:alias
```

Rules:

- Use the `ds_` prefix convention
- Must be valid JS identifiers (start with letter/`_`/`$`, no spaces/hyphens, no leading digits)
- Examples: `ds_shipment_bol`, `ds_orders`, `ds_inventory_summary`
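The rules above can be checked mechanically before running `generate`. A small sketch — the helper name is mine, and the regex encodes only the conventions stated here (`ds_` prefix, JS-identifier characters, no spaces or hyphens):

```typescript
// Check the ds_ prefix convention plus the valid-JS-identifier rule in one pass.
// Allowed after the prefix: letters, digits, underscore, $ — nothing else.
function isValidRefName(name: string): boolean {
  return /^ds_[A-Za-z0-9_$]*$/.test(name);
}
```

Example: `isValidRefName('ds_shipment_bol')` passes, while `ds-shipment` (hyphen) and `my_report` (missing prefix) fail.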
## Key Rules

- Use `schema batch` to combine multiple schema discovery calls into 2-3 requests instead of 9+ sequential calls
- Check composite keys -- some entities have multi-field keys that affect `--param-keys` behavior
- Use `$top=1` during query testing -- large queries time out without limits
- Single quotes for `$` values -- the shell expands `$` in double quotes; use single quotes for `-q`, `--linked`, `--custom-column`, `--datasource-param`, `--linked-param`
- Always include `$select` in `$expand` -- bare `$expand` without `$select` pulls all fields
## Common Mistakes

| Mistake | Fix |
|---|---|
| Using manager connection name as `--api-setting-name` | Use app-level name from branch settings |
| `{Param}` instead of `${$datasource.inParams.Param}` in filter | `--detect-params` requires template literal syntax -- simple `{curly braces}` are silently ignored |
| Using only `--dynamic-filter` for required params | Dynamic filters are optional UI filters -- use `--detect-params` with template literals for required params |
| Not verifying `in_params` after upsert | Always run `datasource-fields` and confirm `in_params` is populated, not empty |
| Assuming inParam name is `id` | Check `datasource-fields` output -- might be `shipmentId`, `orderId`, etc. |
| Datasource `-t` title differs from `-r` reference name | Title and reference must be identical (e.g., `-r ds_foo -t "ds_foo"`) |
| Linked target doesn't exist | Create targets first, verify with `datasource-fields` |
| `mergeByValue` on `oneToOne` | Only `oneToOneWithMerge` gets 4th component |
| Hardcoded dates in filter | Use `${new Date(...).toISOString()}` for dynamic |
| Raw OData queries in flow code | Flow code must reference standalone datasources via `$datasources.RepoName.ds_name.get()` / `.getList()` — never embed OData query strings |
| `$datasources.ds_name` without module scope | Standalone datasources are scoped under the repository module — use `$datasources.RepoName.ds_name` (e.g., `$datasources.PurchaseOrders.ds_shipment`). Without the module prefix, the flow validator reports "Property does not exist on type 'IDatasourceService'" |
| Flow datasource without standalone OData dependencies on the branch | Create and upsert the OData datasources first, then write the flow that references them |