Import Test Data
After installation, you can generate realistic demo data to explore Primentra without entering everything by hand. The `generate-demo-data.ps1` PowerShell script creates models, entities, attributes, and data rows directly in your database.
How to run it:
- Open PowerShell as Administrator
- Navigate to the Primentra scripts folder
- Run the script (both commands are sketched below)
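A minimal sketch of the navigation and run steps, assuming a default installation path (the actual scripts folder depends on where Primentra was installed):

```powershell
# Path is an assumption; adjust it to your installation's scripts folder
cd 'C:\Program Files\Primentra\scripts'

# Run the generator; it prompts for each setting interactively
.\generate-demo-data.ps1
```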
The script prompts you for six settings:
| Setting | Description | Default |
|---|---|---|
| Models | Number of models to create (1–10) | 3 |
| Entities per model | Number of entities per model (1–8) | 4 |
| Min records | Minimum rows per entity (0–1,000,000) | 0 |
| Max records | Maximum rows per entity (0–1,000,000) | 50 |
| Avg attributes | Average attributes per entity (1–30) | Automatic (4–8 per entity) |
| Delete existing data | Wipe all data first, or add on top of existing data | No (additive) |
Each entity gets a random number of rows between the minimum and maximum. You can also pass the first five settings as parameters to skip their prompts:
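For example, a non-interactive run might look like this; the parameter names are assumptions inferred from the prompt labels, so check the script's `param()` block for the exact names:

```powershell
# Parameter names are illustrative guesses; verify them against generate-demo-data.ps1
.\generate-demo-data.ps1 -Models 3 -EntitiesPerModel 4 -MinRecords 1000 -MaxRecords 50000 -AvgAttributes 6
```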
Additive by default — the script adds data on top of whatever is already in the database. It picks from 10 built-in themes (Organisation, HR, Finance, Logistics, Sales, IT, Quality, Procurement, Marketing, Facilities) and avoids duplicating model names that already exist.
Performance notes:
- Small entities (under 5,000 rows) are generated in PowerShell and inserted as SQL VALUES statements
- Large entities (5,000+ rows) are generated server-side using SQL CTEs — much faster for high volumes (see the sketch after this list)
- Progress is printed every 10,000 rows so you can track long-running inserts
- A 99,000-row import typically completes in under 30 seconds
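The server-side path boils down to a numbers-CTE pattern: cross-joining a system catalog against itself yields enough rows to number, and those numbers drive the inserts. A hedged illustration of the idea using `Invoke-Sqlcmd` from the SqlServer module (not the script's actual SQL; the instance and database names are placeholders):

```powershell
# Illustration only: produce 50,000 sequential numbers server-side with a CTE.
# Patterns like this avoid round-tripping every generated row through PowerShell.
$rowCount = 50000
$query = @"
WITH Numbers AS (
    SELECT TOP ($rowCount) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects AS a CROSS JOIN sys.all_objects AS b
)
SELECT n FROM Numbers;
"@
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'Primentra' -Query $query
```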
Disk space warning for large imports:
Inserting millions of rows can cause your SQL Server transaction log to grow significantly, especially if the database is in Full recovery mode. Before running large imports (500,000+ rows), check the following:
- Transaction log space — Run `DBCC SQLPERF(LOGSPACE)` in SSMS to see current log usage. If log space is tight, either switch the database to Simple recovery mode temporarily, or schedule regular log backups during the import (a scripted version of these checks follows this list).
- Data file space — The `EntityRows` and `AttributeValues` tables grow proportionally with row count. A 1,000,000-row import with 10 attributes per entity adds roughly 1–2 GB of data, depending on value types.
- Monitor during import — For very large runs, keep an eye on disk usage via Windows Explorer or `SELECT * FROM sys.dm_db_file_space_usage` in SSMS. If disk space runs out mid-import, the script will fail and leave partial data that needs cleaning up.
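These checks can be scripted as well. A hedged sketch using `Invoke-Sqlcmd` (the database name `Primentra` and the backup path are placeholders):

```powershell
# Check transaction log usage before a large import
Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'DBCC SQLPERF(LOGSPACE)'

# Optionally switch to Simple recovery for the import, then restore Full afterwards
Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'ALTER DATABASE [Primentra] SET RECOVERY SIMPLE'
.\generate-demo-data.ps1
Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'ALTER DATABASE [Primentra] SET RECOVERY FULL'

# A full backup restarts the log backup chain after switching back to Full recovery
Invoke-Sqlcmd -ServerInstance 'localhost' -Query "BACKUP DATABASE [Primentra] TO DISK = N'C:\Backups\Primentra.bak'"
```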