
Install Scanner

Overview

Tycho Data Osprey provides lightweight scanners that connect to your existing PI System infrastructure to collect metadata, evaluate data quality, and power dashboards, workflows, and alerts. This quickstart guide walks you through downloading, configuring, and running the scanner on Windows. By the end, you'll have one or more scanners running on a schedule, reporting real-time data quality status for PI Data Archive, PI Asset Framework, and PI Vision.

Prerequisites

Ensure the following requirements are met before running the scanner:

Requirements

  • .NET Framework 4.8 must be installed on the scanner host machine
  • PI AF Explorer must be installed on the scanner host machine
  • The scanner host machine must be network routable to the PI Data Archive, PI Asset Framework, and the SQL Server containing the PI Vision database
  • Windows user (AF access) requires read access to PI Asset Framework
  • Windows user (PI access) requires read access to PI Data Archive (data and point security)
  • Windows user (PI Vision access) requires read-only access to the PI Vision SQL database. See Preparing authentication in PI Vision Scanner.

Pre-flight Check

Ensure Tycho Data Osprey can access Azure Data Lake Storage Gen2 for scanners. Verify the following:

  • Ensure the server has outbound internet access to Azure endpoints.
  • Confirm the firewall rules allow traffic to *.dfs.core.windows.net.
  • Use Test-NetConnection in PowerShell to validate connectivity:

Test-NetConnection -ComputerName tychodataosprey.dfs.core.windows.net -Port 443

  • Use a browser to confirm that the following test file can be downloaded:

https://tychodataosprey.blob.core.windows.net/scanners/osprey1.2.3.zip

If the download succeeds, connectivity to Azure Data Lake Storage Gen2 is verified.

Installation

  1. Download the Scanner

    Navigate to the /downloads page of your Osprey instance, e.g. https://<your-osprey-instance>/downloads

  2. Extract the Scanner

    Extract the contents to C:\Program Files (x86)\Tycho Data\Osprey

  3. Set up Configuration File

    If this is a new installation, rename the appsettings.ini.sample file in the extracted directory to appsettings.ini

Add Scanner Instances and API Token in Osprey

  1. Create Scanners in Osprey UI

    • In Osprey, go to Scanners from the left navigation

    • Click Add Scanner

    • Select type: PI Data Archive, PI Asset Framework, or PI Vision

    • Enter a name

    • Copy the Scanner ID

    • Repeat for each system you want to monitor

  2. Create an API Token

    • In Osprey, from your profile dropdown, select Settings

    • Click API Tokens

    • Set Description to PI Scanners

    • Set Expiration (optional). A value of 0 means the API token never expires.

    • Click Generate.

    • Save the API Token. You will need it for the scanner configuration file.

We recommend one scanner per PI Data Archive, AF server, and PI Vision server, unless you decide to parallelize some scanners (for instance across multiple PI AF Databases).

Common Configuration

Next, configure the scanner through the appsettings.ini file.

Hint: You can leave optional parameters empty if you are not using them.

These settings go under the [common] section. Populate with your information:

| Key | Description | Required | Example |
|---|---|---|---|
| tychoDataHost | Your Osprey instance URL | Yes | https://experience.tychodata.com/ |
| tychoDataDomainID | Your domain ID | Yes | 30 |
| tychoDataAPIToken | API token (use an environment variable) | Yes | ${TYCHODATA_OSPREY_APITOKEN} |
| pathToCert | Path to SSL certificate (optional) | No | C:\cert\selfsigned.crt |
| updateStatus | Report scan status periodically to Osprey | No | false |
| schedule | Cron string to control scan frequency | Yes | 0 */2 * * * |

Cron is used to define how frequently the scanner will run. Examples are provided below:

| Description | Cron String |
|---|---|
| Run every hour | 0 * * * * |
| Run every four hours | 0 */4 * * * |
| Run every day at 3am | 0 3 * * * |
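As a sketch of how such a schedule gates runs, the minute and hour fields of a 5-field cron string can be matched like this. This is illustrative only, not the scanner's actual parser, and it supports just the "*", "*/n", and plain-number forms used in the examples above:

```python
# Minimal 5-field cron matcher (illustrative sketch, not the scanner's parser).
# Supports only the forms used in this guide: "*", "*/n", and a plain number.

def field_matches(field: str, value: int) -> bool:
    if field == "*":
        return True
    if field.startswith("*/"):
        return value % int(field[2:]) == 0
    return int(field) == value

def cron_matches(cron: str, minute: int, hour: int) -> bool:
    """Check the minute and hour fields of a 5-field cron string."""
    minute_f, hour_f, *_ = cron.split()
    return field_matches(minute_f, minute) and field_matches(hour_f, hour)

# "0 */4 * * *" fires at minute 0 of hours divisible by 4.
print(cron_matches("0 */4 * * *", 0, 8))   # True
print(cron_matches("0 */4 * * *", 30, 8))  # False
print(cron_matches("0 3 * * *", 0, 3))     # True
```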

An example is provided below:

[common]
tychoDataHost = https://experience.tychodata.com/
tychoDataDomainID = 30
tychoDataAPIToken = ${TychoData_APIToken}
pathToCert = 
updateStatus = false
schedule = 0 */2 * * *
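The ${TychoData_APIToken} value above keeps the secret out of the configuration file and resolves it from the environment at runtime. A minimal sketch of that expansion pattern (how the scanner actually resolves these is not documented here):

```python
# Sketch of ${VAR} expansion as implied by the appsettings.ini examples
# (e.g. tychoDataAPIToken = ${TychoData_APIToken}). Illustrative only.
import os
import re

def expand_env(value: str) -> str:
    """Replace ${NAME} with the value of environment variable NAME."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

os.environ["TychoData_APIToken"] = "abc123"   # set by your deployment, not in the file
print(expand_env("${TychoData_APIToken}"))    # abc123
```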

Configuring scan-pidataarchive

Populate the settings below with information about your PI Data Archive instance.

Hint: You can leave optional parameters empty if you are not using them.

| Key | Description | Required | Example |
|---|---|---|---|
| scannerId | From Osprey UI | Yes | 74ab... |
| serverName | Hostname of the PI Data Archive server | Yes | WIN-PI-SERVER |
| piUser | Leave blank for Windows-authenticated access | No | |
| piPassword | Leave blank for Windows-authenticated access | No | |
| dataQualityLevel | See table below | Yes | 2 |
| pointsource_filter | Wildcard for filtering tags | Yes | * |
| dataQualityStartTime | Start of time window to evaluate data | Yes | *-1h |
| dataQualityEndTime | End of time window to evaluate data | No | * |
| performScan | Run scanner locally | Yes | true |
| upload | Send results to Osprey | Yes | true |

An example is provided below:

[scan-pidataarchive]
scannerId = <copied-from-osprey>
serverName = WIN-PI-SERVER
piUser =
piPassword =
dataQualityLevel = 2
pointsource_filter = *
dataQualityStartTime = *-1d
performScan = true
upload = true

Configuring scan-piassetframework

Populate the settings below with information about your PI Asset Framework instance.

Hint: You can leave optional parameters empty if you are not using them.

| Key | Description | Required | Example |
|---|---|---|---|
| scannerId | From Osprey UI | Yes | b0ad... |
| serverName | AF Server hostname | Yes | WIN-AF-SERVER |
| databaseNames | Comma-separated AF databases | Yes | AFDB1, AFDB2 |
| afUser | AF credentials (use env vars) | No | ${AF_User} |
| afPassword | AF credentials (use env vars) | No | ${AF_UserPassword} |
| dataQualityLevel | See table below | Yes | 2 |
| dataQualityStartTime | Start of time window for scanning linked tags | Yes | *-1h |
| dataQualityEndTime | End of time window for scanning linked tags | No | * |
| scanAnalysisRuntime | Include AF Analysis runtime | No | true |
| performScan | Run scan locally | Yes | true |
| upload | Upload results to Osprey | Yes | true |
| includeAFServerAsset | Include AF server as asset | No | true |
| includeFiltersRegex | Regex for inclusion | No | \\\\Server\\DB\\North America.* |
| excludeFiltersRegex | Regex for exclusion | No | \\\\Server\\DB\\North America\\Under Construction.* |

An example is provided below:

[scan-piassetframework]
scannerId = <copied-from-osprey>
serverName = WIN-AF-SERVER
databaseNames = AFDB1, AFDB2
afUser = ${AF_User}
afPassword = ${AF_Password}
dataQualityLevel = 2
dataQualityStartTime = *-1d
dataQualityEndTime = *
scanAnalysisRuntime = true
performScan = true
upload = true
includeAFServerAsset = true
includeFiltersRegex = \\\\Server\\DB\\North America.*|\\\\Server\\DB\\LATAM\\Argentina.*
excludeFiltersRegex = \\\\Server\\DB\\North America\\Under Construction.*
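The include/exclude patterns compose as a filter over AF element paths: a path must match the include regex and must not match the exclude regex (the doubled backslashes exist because AF paths themselves contain backslashes, which must be escaped in a regex). A sketch of that logic, illustrative only and not the scanner's implementation:

```python
# How includeFiltersRegex / excludeFiltersRegex could be applied to AF
# element paths (illustrative sketch, not the scanner's implementation).
import re

include = r"\\\\Server\\DB\\North America.*"
exclude = r"\\\\Server\\DB\\North America\\Under Construction.*"

def keep(path: str) -> bool:
    """Keep a path only if it matches include and does not match exclude."""
    if not re.match(include, path):
        return False
    return not re.match(exclude, path)

print(keep(r"\\Server\DB\North America\Plant1"))                    # True
print(keep(r"\\Server\DB\North America\Under Construction\Unit7"))  # False
print(keep(r"\\Server\DB\EMEA\Site3"))                              # False
```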

Configuring scan-pivision

Populate the settings below with information about your PI Vision instance.

Hint: You can leave optional parameters empty if you are not using them. The AF user credentials are required to help resolve data source references. For instance, for PI Vision collections, the AF attribute or tag may not be explicitly stored in PI Vision.

| Key | Description | Required | Example |
|---|---|---|---|
| scannerId | From Osprey UI | Yes | c684... |
| piVisionServer | Hostname of the PI Vision web server | Yes | WIN-VISION-SERVER |
| sqlHost | SQL Server hostname | Yes | WIN-VISION-SERVER\MSSQLSERVER (for a named instance) or WIN-VISION-SERVER |
| sqlPort | SQL port (default 1433) | Yes | 1433 |
| sqlDatabase | PI Vision database name | Yes | PIVision |
| sqlUsername | SQL credentials (defaults to Windows authentication if empty) | No | ${PIVisionSQLUser} |
| sqlPassword | SQL credentials (defaults to Windows authentication if empty) | No | ${PIVisionSQLUserPassword} |
| afUser | AF credentials for reference resolution | No | ${AF_User} |
| afPassword | AF credentials for reference resolution | No | ${AF_User_Password} |
| performScan | Run scanner locally | Yes | true |
| upload | Upload results to Osprey | Yes | true |

An example is provided below:

[scan-pivision]
scannerId = <copied-from-osprey>
piVisionServer = WIN-VISION-SERVER
sqlHost = WIN-VISION-SERVER\\MSSQLSERVER
sqlPort = 1433
sqlDatabase = PIVision
sqlUsername = 
sqlPassword = 
afUser = ${AF_User}
afPassword = ${AF_Password}
performScan = true
upload = true

Understanding Key Parameters

dataQualityLevel

| Value | Meaning |
|---|---|
| 0 | No time-series data quality values collected |
| 1 | Only current values scanned for data quality |
| 2 | Summary statistics scanned (min, max, % good, etc.) |

Recommended: Use 2 for complete analysis
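To make level 2 concrete, here is the kind of summary a scan window could produce from a series of (value, is_good) samples. This is purely illustrative; the scanner's actual statistics and their names are defined by Osprey, not by this sketch:

```python
# Illustrative sketch of level-2 style summary statistics over a scan
# window of (value, is_good) samples. Not the scanner's implementation.

def summarize(samples):
    """Return min/max of good values and the percentage of good samples."""
    good = [v for v, ok in samples if ok]
    return {
        "min": min(good) if good else None,
        "max": max(good) if good else None,
        "pct_good": 100.0 * len(good) / len(samples) if samples else 0.0,
    }

print(summarize([(10.0, True), (12.5, True), (99.9, False), (11.0, True)]))
# {'min': 10.0, 'max': 12.5, 'pct_good': 75.0}
```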

performScan vs upload

When should you change performScan and upload parameters?

| Scenario | performScan | upload |
|---|---|---|
| Standard use (scan + upload) | true | true |
| No network access (air-gapped) | true | false |
| Upload existing scan results only | false | true |
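For example, an air-gapped PI Data Archive scan that runs locally but skips the upload might look like the following (a sketch reusing the scan-pidataarchive keys shown earlier):

```ini
[scan-pidataarchive]
scannerId = <copied-from-osprey>
serverName = WIN-PI-SERVER
dataQualityLevel = 2
pointsource_filter = *
dataQualityStartTime = *-1d
performScan = true
upload = false
```

The results can then be transferred to a connected host, where running with performScan = false and upload = true sends the existing scan results to Osprey.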

Running the Scanner

Once your configuration is ready:

  1. Open PowerShell
  2. Navigate to the scanner directory
  3. Run:
.\TychoScanner.exe

The scanner will run based on the schedule you defined in the [common] section.

After the Scanner Runs

Once the scanner completes a run and uploads results to Tycho Data Osprey:

  1. Review logs to confirm the upload was successful. Osprey automatically ingests the uploaded data and runs a set of backend automations.

  2. Navigate to the Osprey web interface. Go to your instance (e.g., https://experience.tychodata.com) and log in.

  3. Review automated results. Osprey will process and generate:

    • Data Quality Reports

    • Lineage Mapping

    • Blast Radius Analysis

Once this process completes, you'll have full visibility into the health and impact of your PI System data.

FAQ

How does the scanner authenticate to PI AF?

The scanner authenticates to PI Asset Framework (AF) based on the Windows user account that runs the scanner process. To ensure successful authentication, you should run the scanner as a specific user or service account that has read access to PI AF. Make sure this account is granted the necessary permissions in PI AF to allow the scanner to collect metadata and perform its tasks.

What is a server-alias?

A server-alias is a mapping that allows you to define alternate names for a server instance. This is useful when different systems (such as AF or PI Vision) reference the same PI Data Archive using different identifiers, such as an IP address or a fully qualified domain name (FQDN).

For example, in your configuration file:

[server-aliases]
10.1.2.5 = na.pi.company.com

This tells Osprey that any reference to 10.1.2.5 should be treated as na.pi.company.com. This helps resolve lineage and data quality issues that arise when the same server is referenced by different names across your environment. Defining these aliases ensures that all relationships, asset mappings, and visualizations are unified under a single canonical server name.
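The effect of the mapping can be sketched as a simple lookup that canonicalizes every server reference before lineage is assembled. This is an illustration of the concept, not Osprey's implementation:

```python
# Sketch of how a [server-aliases] mapping canonicalizes server
# references before lineage is assembled (illustrative only).

aliases = {"10.1.2.5": "na.pi.company.com"}

def canonical(server: str) -> str:
    """Resolve a server reference to its canonical name, if aliased."""
    return aliases.get(server, server)

print(canonical("10.1.2.5"))           # na.pi.company.com
print(canonical("na.pi.company.com"))  # na.pi.company.com
```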