Export data

There are four ways to export data from Next Matter

If you need to export data from Next Matter to an external tool or file it away, you can use one of the following options:

  • Download data as a CSV file (Data > Workflow data)
  • Use the Next Matter API with the script provided below
  • Use an integration step to export data to a 3rd party tool, such as Google Sheets, Redshift, or PostgreSQL
  • Custom export - click Contact us (top-right corner), and tell us what you need

Download as CSV

The CSV file contains the workflow metadata and the values fetched or provided in the workflow. The exported fields are the following:

  • ID
  • name
  • priority
  • tags
  • deadline
  • started time
  • completed time
  • last updated time
  • aborted time
  • lead user
  • stage ID
  • stage name
  • data for each step: activated time, assigned time, completed time, completed by
  • additionally for integration steps: integration step error, integration step success, integration step debug errors, analytics info (execution duration), user variables, output summary

If a field doesn't apply, its value is left empty.
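Once downloaded, the CSV can be loaded into any spreadsheet or analysis tool. A minimal sketch of reading it with pandas, using a hypothetical sample with a few of the columns listed above (check the header row of your actual export, as column names may differ):

```python
import io
import pandas as pd

# Hypothetical sample mimicking a few columns of the export;
# fields that don't apply are empty and load as NaT/NaN.
sample = io.StringIO(
    "ID,name,priority,started time,completed time\n"
    "101,Onboard new vendor,High,2024-03-01T09:00:00Z,2024-03-02T15:30:00Z\n"
    "102,Renew contract,Low,2024-03-05T10:00:00Z,\n"
)

df = pd.read_csv(sample, parse_dates=["started time", "completed time"])

# Workflows without a completed time are still open.
open_workflows = df[df["completed time"].isna()]
print(open_workflows["name"].tolist())
```

Parsing the timestamp columns with `parse_dates` lets you filter and compute durations directly, instead of comparing strings.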

πŸ“˜

You can only export data from one workflow per hour.

How to download a CSV

Download with a script

The script downloads the data and exports it to a CSV file. It creates columns for the workflow ID, instance ID, instance name, and the names of the form fields to be exported, and writes all the data from those form fields into the file.

Python script

To run this script, you need the following:

  • Next Matter API key

  • Workflow ID of the workflow you want to export

  • Step ID of the step to export

  • Form field ID of the step you want to export, the name of the field, and its value type

import requests
import pandas as pd
import time

# Constants

API_KEY = "API KEY"
ROOT_URL = "https://core.nextmatter.com/api"
HEADERS = {"Content-Type": "application/json", "Authorization": f"Api-Key {API_KEY}"}

def fetch_data(session, workflow_id, step_id, form_field_id):
    """
    Fetch data from the API for a given workflow, step, and form field.
    """
    request_url = f"{ROOT_URL}/processes/{workflow_id}/steps/{step_id}/actions/{form_field_id}/"
    response = session.get(request_url)
    return response.json()

def process_response(response, form_field_type):
    """
    Process the response to extract needed values.
    """
    data_values = []
    for result in response['results']:
        instance_id = result['instance_id']
        instance_name = result['instance_name']
        value = str(result['value'][form_field_type])
        data_values.append((instance_id, instance_name, value))
    return data_values

def update_dataframe(df, workflow_id, form_field_names, instance_id, instance_name, form_field_name, value):
    """
    Update the DataFrame with new values or add new rows.
    """
    if instance_id in df['Instance Id'].values:
        df.loc[df['Instance Id'] == instance_id, form_field_name] = value
    else:
        new_row = {'Workflow Id': workflow_id, 'Instance Id': instance_id, 'Instance Name': instance_name, form_field_name: value}
        for field_name in form_field_names:
            if field_name != form_field_name:
                new_row[field_name] = None
        df.loc[len(df)] = new_row
    return df

def main():
    data_to_export = [
        {
            "workflow_id": WORKFLOW_ID,  # replace with workflow ID
            "inputs_to_export": [
                {
                    "step_id": STEP_ID,  # replace with step ID
                    "form_field_id": FORM_FIELD_ID,  # replace with form field ID
                    "form_field_name": "FIELD1_NAME",  # replace with name of the field
                    "form_field_type": "inputValue"
                },
                {
                    "step_id": STEP_ID,
                    "form_field_id": FORM_FIELD_ID,
                    "form_field_name": "FIELD2_NAME",
                    "form_field_type": "inputValue"
                }
            ]
        }
    ]

    session = requests.Session()
    session.headers.update(HEADERS)

    # Extract all unique form_field_names and create the DataFrame
    form_field_names = [field['form_field_name'] for workflow in data_to_export for field in workflow['inputs_to_export']]
    columns = ['Workflow Id', 'Instance Id', 'Instance Name'] + form_field_names
    df = pd.DataFrame(columns=columns)

    for workflow_to_export in data_to_export:
        workflow_id = workflow_to_export["workflow_id"]
        datapoints = workflow_to_export["inputs_to_export"]

        for field in datapoints:
            step_id = field["step_id"]
            form_field_id = field["form_field_id"]
            form_field_name = field["form_field_name"]
            form_field_type = field["form_field_type"]

            response = fetch_data(session, workflow_id, step_id, form_field_id)
            data_values = process_response(response, form_field_type)

            for instance_id, instance_name, value in data_values:
                df = update_dataframe(df, workflow_id, form_field_names, instance_id, instance_name, form_field_name, value)

            time.sleep(1)

    df.to_csv("output.csv", index=False)

if __name__ == "__main__":
    main()
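Note that the script reads only the first page of each API response. If the endpoint paginates its results (a `next` URL alongside `results`, as in many REST APIs; verify this against your actual response body), you can collect all pages with a small helper like this sketch:

```python
def fetch_all_results(session, request_url):
    """Collect results across pages, assuming the payload carries
    'results' and a 'next' URL (verify against your API's response)."""
    results = []
    while request_url:
        payload = session.get(request_url).json()
        results.extend(payload.get("results", []))
        request_url = payload.get("next")
    return results
```

You could then call `fetch_all_results` inside `fetch_data` instead of returning a single `response.json()`.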

Export using integration steps

You can send workflow input values and workflow metadata, such as the workflow name, deadline, lead, tags, or start time, directly from each workflow to a 3rd party tool.
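Integration steps are configured in the workflow builder rather than in code, but conceptually each export step delivers a structured payload to the target tool. A minimal sketch of the kind of JSON body such a step might send, using hypothetical field names and a hypothetical webhook URL:

```python
import json

# Hypothetical payload; the real field names and values are
# configured in the integration step itself.
payload = {
    "workflow_name": "Vendor onboarding",
    "deadline": "2024-04-01",
    "lead": "jane.doe@example.com",
    "tags": ["procurement", "q2"],
    "started_time": "2024-03-01T09:00:00Z",
}

body = json.dumps(payload)
# A receiving tool would get this as the request body, e.g.:
# requests.post("https://example.com/hooks/export", data=body,
#               headers={"Content-Type": "application/json"})
print(body)
```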