
TIA Openness Manager

Security & Data Handling

How we protect your engineering data

What leaves your machine, what doesn't, and how we secure everything in between.

TIA Portal V15-V21
Windows 10/11
MCP Tools
OPC UA Client

Our Security Principles

Local-First Processing

TIA Portal projects are processed exclusively on your local machine through the Siemens Openness interface.

The following never leaves your workstation through our software:

  • Project contents and TIA Portal project files
  • PLC code (SCL, STL, LAD, FBD, S7DCL, SPL)
  • Exported XML, block sources, or WinCC Unified screens
  • Hardware configurations
  • Tag tables, variable values, or online diagnostic data

Your engineering data stays where it belongs — on the workstation running TIA Portal.

No Telemetry, No Tracking

The application performs no usage analytics, no crash telemetry, and no behavioural tracking. There are no third-party analytics SDKs embedded in the product.

  • No session tracking
  • No feature-usage beacons
  • No automatic error reports sent to the vendor
  • No A/B testing, no experiments

If you need to share diagnostic information for a support case, you explicitly export a log bundle and send it yourself — nothing is uploaded in the background.

License Server — The Only Outbound Connection

The only data ever transmitted to our license server is:

  • Customer ID (assigned at purchase)
  • Hardware ID (a local fingerprint of your workstation, derived from hardware identifiers)
  • Timestamp

That is the full payload. No engineering data, no project metadata, no usage statistics, no file names.

Verification guarantees:

  • License responses are digitally signed with ECDSA. A tampered or forged response is rejected by the client before it is trusted.
  • The Hardware ID must match the record bound at activation — so a valid license response cannot be replayed on a different machine.
  • Rate limiting protects against brute-force probing of customer IDs.

The application operates offline for up to 14 days between validations — useful for air-gapped engineering workstations.

Credentials & API Keys

All sensitive credentials are stored in your operating system's protected credential vault — never in plaintext configuration files, log files, or export files.

This applies to every credential the application handles:

  • AI provider API keys (Anthropic, OpenAI, Google, Azure, AWS, etc.)
  • OAuth tokens for third-party integrations
  • PLCSIM Advanced master-secret passwords
  • Git hosting credentials and personal access tokens
  • Web search API keys
  • MCP server credentials

If you uninstall the application, these credentials remain under the control of your operating system account until you delete them — they are not written to application-managed files that might end up in backups or support bundles.

Password Vault (Know-how Protection)

The built-in password vault for TIA Portal know-how protection passwords uses industry-standard authenticated encryption:

  • AES-256-GCM with a 128-bit authentication tag — any tampering with the vault file is detected on open
  • PBKDF2-SHA256 key derivation with 600,000 iterations — in line with current OWASP guidance for PBKDF2-HMAC-SHA256
  • A fresh random 96-bit nonce for every write operation
  • Plaintext is explicitly zeroed from memory after use

The master password itself is never stored. Only the derived key material is held in memory, and only while the vault is unlocked. Locking the vault clears the key.

The vault file is a standalone encrypted file you fully control — back it up, move it between workstations, or delete it. Without the master password, the contents are unrecoverable.
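The key-derivation parameters above can be sketched with the Python standard library. This is a minimal illustration, not the vault's implementation: the actual AES-256-GCM encryption step requires a cryptography library and is only indicated in comments, and the function names are assumptions.

```python
import hashlib
import secrets

PBKDF2_ITERATIONS = 600_000  # PBKDF2-SHA256 iteration count, as stated above
KEY_LEN = 32                 # 256-bit key for AES-256-GCM
NONCE_LEN = 12               # fresh 96-bit nonce for every write


def derive_vault_key(master_password: str, salt: bytes) -> bytes:
    """Derive the 256-bit vault key from the master password.

    The master password itself is never stored; only this derived key is
    held in memory while the vault is unlocked.
    """
    return hashlib.pbkdf2_hmac(
        "sha256",
        master_password.encode("utf-8"),
        salt,
        PBKDF2_ITERATIONS,
        dklen=KEY_LEN,
    )


def fresh_nonce() -> bytes:
    """A new random 96-bit nonce, generated for each write operation."""
    # The derived key + this nonce would then feed AES-256-GCM, whose
    # 128-bit authentication tag detects any tampering on open.
    return secrets.token_bytes(NONCE_LEN)
```

Because PBKDF2 is deterministic for a given password and salt, the same master password always reproduces the same key, while the per-write random nonce keeps every ciphertext distinct.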

AI Features — Opt-In by Configuration

AI features stay disabled until you configure them. No AI traffic ever leaves your machine unless you explicitly select a provider and supply an API key.

You choose the provider:

  • Anthropic Claude
  • OpenAI (Chat and Responses API)
  • Azure OpenAI (for customers under their own Azure data-residency agreement)
  • Google Gemini
  • Google Vertex AI
  • AWS Bedrock
  • Ollama — for fully local, offline inference on your own hardware or on-premise server

For confidential engineering projects, Ollama gives you a complete local AI workflow with no external traffic at all.

You decide, per session, which content is shared with the chosen provider. Files, blocks, and attachments are only included in a request when you actively add them to the chat — there is no automatic ingestion of your project.

Release Integrity

Every release of the application is Authenticode code-signed using a SHA-256 digest and timestamped, so Windows can verify its integrity before installation. SmartScreen and Group Policy can enforce signature checks against our publisher identity.

Updates are delivered through a built-in auto-update mechanism that validates signatures before installing — an unsigned or tampered update package is rejected.

Local Data Location

All application data is stored under your Windows user profile at:

%LocalAppData%\TiaOpennessManager

This includes log files, user settings, chat history, the encrypted vault file, and local caches. None of these paths are synced, uploaded, or shared by the application.

If your organisation uses roaming profiles or enforces specific data-locality rules, the data stays under the policies applied to %LocalAppData% — the application does not create copies elsewhere.
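Resolving that directory amounts to expanding %LocalAppData% and appending the application folder. A small sketch, with a non-Windows fallback added purely so the snippet runs anywhere — the application itself targets Windows:

```python
import os


def app_data_dir() -> str:
    """Return the application data directory under the user profile.

    On Windows this is %LocalAppData%\\TiaOpennessManager; the fallback
    branch exists only so this illustration runs on other platforms.
    """
    local = os.environ.get("LOCALAPPDATA") or os.path.expanduser("~/.local/share")
    return os.path.join(local, "TiaOpennessManager")
```

Everything the application writes (logs, settings, chat history, the encrypted vault, caches) lives under this one directory, which makes it easy to audit, back up, or wipe.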

Recommendations for Confidential Projects

For engineering projects under NDA or strict confidentiality requirements, we recommend the following configuration:

  • Use Ollama for AI features — fully local inference, no external provider involved
  • Or leave AI features unconfigured — the chat panel simply stays idle
  • Keep the password vault active for know-how protected blocks
  • Use the built-in Git client for traceable, auditable version history of your exports
  • Review the contents of log bundles before sharing them for support — by default they stay local
  • Install on a workstation covered by your corporate security policy (full-disk encryption, endpoint protection, least-privilege account)

If you have specific internal security or compliance requirements, contact us — we are happy to walk through your setup in a technical call and answer questions about data flow, credential handling, and deployment options.

Questions?

For detailed questions about data handling, security architecture, or compliance:

Email: support@tiaopenessmanager.ch

See also our Privacy Policy for the formal data-protection notice and our Legal Notice for provider information.


© 2025-2026 AnyAutomation. All rights reserved.


