
Scripting in Torq Workflows: Bring Your Own Code

Incorporate scripts into Torq workflows using dedicated steps for Python, JavaScript, PowerShell, and Bash, with support for AI completion.


Use scripts in your workflows to handle edge cases and add flexibility beyond what built-in steps provide. If you already have a script written for a task, you can easily repurpose it in Torq. Every supported language has a dedicated step: paste your script into the scripting step's input, and it's ready to run. Some scripting steps also support AI completion.

Start with the built-in steps. Most API calls, condition checks, and notifications can already be handled natively in Torq. Use these first before resorting to custom code: they're simpler, faster to maintain, and fully supported.

Find the scripting steps easily by typing script into the search bar in Torq's Builderbox. You'll find them listed under the Scripting category.

Python

Torq comes with built-in support for running Python scripts across various use cases, each with pre-installed libraries tailored to specific needs. You can also use the optional REQUIREMENTS variable to install additional packages as needed.

For example, you can add this at the top of your Python script:
REQUIREMENTS = ["pandas"]
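As a slightly fuller sketch (the event payload below is invented for illustration), the REQUIREMENTS variable sits at the top of the script and Torq installs the listed packages before execution; the rest of the script is ordinary Python:

```python
# REQUIREMENTS is read by Torq before the script runs;
# outside Torq it is just a plain variable assignment.
REQUIREMENTS = ["pandas"]

import json

# Example payload a previous workflow step might have produced (illustrative)
event = {"user": "alice", "attempts": 5}

# Print JSON so the step's output can be referenced by downstream steps
print(json.dumps({"locked": event["attempts"] >= 3}))
```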

All Python script steps come with:

  • Python Standard Library

  • pyOpenSSL

  • crcmod

  • requests

Available Script Steps

  • Run a Python Script: Ideal for general-purpose scripting and API interactions. Includes only the standard set of pre-installed packages listed above.

  • Run a Python Data Processing Script: Best for data wrangling, transformation, and Excel file manipulation. Extends the base environment with the following additional packages:

    • pandas

    • numpy

    • openpyxl

  • Run a Python Script with Database Tools: Useful for querying, inserting, or analyzing data stored in external databases. Includes all packages from the Data Processing step, plus database connectivity tools:

    • psycopg (PostgreSQL)

    • PyMySQL (MySQL)

    • pymssql (Microsoft SQL Server)

  • Run a Python IOC Extraction Script: Perfect for extracting Indicators of Compromise (IOCs) and structuring threat-related data. Specialized for security automation and threat intelligence parsing, pre-installed with:

    • ioc-finder

    • msticpy

    • pydantic

    • jinja2

    • ruamel.yaml

    • python-json-logger
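To illustrate the shape of a database script like the ones the Database Tools step is designed for: the step ships client drivers for PostgreSQL, MySQL, and SQL Server, but the sketch below uses sqlite3 from the Python standard library purely so it is self-contained; the connect-query-print pattern is the same with any of the bundled drivers.

```python
import sqlite3

# In-memory database stands in for an external server reachable from the step
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alerts (id INTEGER, severity TEXT)")
conn.executemany("INSERT INTO alerts VALUES (?, ?)",
                 [(1, "high"), (2, "low"), (3, "high")])

# Query with a parameterized statement and print the result
# so downstream steps can consume it
rows = conn.execute(
    "SELECT id FROM alerts WHERE severity = ?", ("high",)
).fetchall()
print([r[0] for r in rows])  # -> [1, 3]
conn.close()
```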

File Output

Torq’s Python scripting supports file output for scripts that generate structured or large data, such as reports, CSVs, or Excel files. Instead of returning raw text, a script can create a file, encode it in base64, and pass it along to downstream steps.

This is helpful when dealing with large outputs or content that benefits from a specific format. It ensures better structure, avoids hitting output size limits, and aligns with common Python workflows where file generation is standard practice.
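A minimal sketch of the pattern, using only the standard library (the CSV contents are invented for illustration; how the encoded file is consumed downstream depends on your workflow):

```python
import base64
import csv
import io

# Build a small CSV report in memory
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["indicator", "score"])
writer.writerow(["198.51.100.7", "85"])

# Base64-encode the file contents so they can be passed
# between steps as structured output rather than raw text
encoded = base64.b64encode(buf.getvalue().encode("utf-8")).decode("ascii")
print(encoded)
```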

Learn how to handle large or binary HTTP responses by converting them to files in Torq workflows in this KB article.

PowerShell Core

The PowerShell Core step executes scripts and comes preinstalled with the PSWSMan and ExchangeOnlineManagement modules. To use additional modules, specify them using the MODULES parameter.
For example, if you need to run SQL commands in PowerShell Core, add the SqlServer module via the MODULES parameter, and include Import-Module SqlServer in your script to load it before execution.
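A sketch of that SqlServer example (the server, database, and query are hypothetical placeholders; this assumes MODULES was set to SqlServer on the step):

```powershell
# Assumes the step's MODULES parameter includes SqlServer
Import-Module SqlServer

# Hypothetical connection details for illustration only
$results = Invoke-Sqlcmd -ServerInstance "sql.example.com" -Database "tickets" `
    -Query "SELECT TOP 5 id, status FROM incidents"

# Emit JSON so downstream steps can parse the result
$results | ConvertTo-Json
```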

Bash

Torq supports Bash scripting as a powerful way to extend and customize workflows using standard UNIX shell commands. This allows users to perform complex text processing, data transformation, and system interactions directly within their automations.

Bash scripts in Torq can leverage a wide range of common UNIX utilities, including:

  • sed: Stream editor for modifying and transforming text

  • grep: Powerful pattern matching and search tool

  • jq: Command-line JSON processor for querying and manipulating JSON data

  • pup: HTML parser that makes it easy to extract elements from web content

  • faq: CLI tool for transforming structured data between formats like JSON, YAML, XML, and TOML

You can incorporate other standard tools to match your automation needs.
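As a small sketch (the payload is invented for illustration), grep and sed alone can pull a single field out of a JSON string; jq is the better choice for anything more complex:

```shell
# Example payload, as might arrive from a previous step
payload='{"severity":"high","host":"web-01"}'

# grep -o keeps only the matching fragment; sed strips the key and quotes
severity=$(printf '%s' "$payload" | grep -o '"severity":"[^"]*"' | sed 's/.*://; s/"//g')
echo "$severity"  # -> high
```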

This makes Bash scripting in Torq ideal for scenarios like:

  • Parsing API responses

  • Filtering and reformatting logs or payloads

  • Automating system-level tasks

  • Combining outputs from multiple steps for use later in the workflow

By integrating Bash into workflows, users can handle edge cases and complex conditions with precise, low-level control.

JavaScript

Torq supports JavaScript scripting using Node.js, allowing users to write and execute custom logic directly within their workflows. This provides powerful flexibility for manipulating data, implementing dynamic logic, or integrating with APIs beyond the capabilities of prebuilt workflow steps.

Key features include:

  • Full Node.js Environment: Scripts run in a Node.js runtime, giving access to modern JavaScript syntax (ES6+) and core modules.

  • Workflow Integration: Scripts can access and manipulate workflow context, step outputs, and runtime variables to drive decisions or reformat data.

  • Custom Logic: Ideal for conditional logic, loops, complex data transformations, or handling edge cases not covered by built-in steps.

  • Data Handling: Easily work with JSON, arrays, strings, numbers, and dates using native JavaScript methods.

  • Modularity: Keep scripts organized and reusable across workflows by encapsulating logic in functions.
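To illustrate the data-handling point, here is a minimal sketch (the alert data is invented) that reshapes a JSON array with native methods and no external dependencies:

```javascript
// Example output a previous step might have produced (illustrative)
const alerts = [
  { id: 1, severity: "high", host: "web-01" },
  { id: 2, severity: "low", host: "db-02" },
  { id: 3, severity: "high", host: "api-03" },
];

// Keep only high-severity alerts and return just the host names
const highHosts = alerts
  .filter((a) => a.severity === "high")
  .map((a) => a.host);

// Print JSON so downstream steps can consume the result
console.log(JSON.stringify(highHosts)); // -> ["web-01","api-03"]
```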

By using JavaScript, developers and technical users gain the control and flexibility needed for advanced automation scenarios, without leaving the no-code/low-code environment of Torq.

Best Practices

Scripting in Torq is powerful, especially when native steps can’t fully address your use case. To get the most out of your scripts in workflows, follow these best practices.

  • Favor Out-of-the-Box Steps First: Most API calls, condition checks, and notifications can be handled natively. Explore built-in steps before adding code. That said, scripting is ideal when:

    • You need complex logic or processing not supported natively

    • You're integrating with a custom system or proprietary data format

    • You're performing data transformation or enrichment with external libraries

  • Use Scripts as Part of a Workflow, Not in Isolation: Torq orchestrates logic, triggers, and external systems. Scripts should be single-purpose functions. Pass inputs via workflow context and send outputs downstream.

  • For Advanced Use Cases, Use Custom Containers: Run scripts using a custom container when you need your own runtime, dependencies, or internal packages.
