research – Share your favorite template(s) for capturing user journeys and workflows


SharePoint site collection backup and restore with Nintex Forms and workflows

We are using SharePoint 2016 on-premises with Nintex Forms and workflows. I would like to back up a site collection and restore it on a different farm. If I take a backup of the site collection's content database and restore it on the new farm, will the restored site also include all of the Nintex forms, Nintex workflows, and Nintex task forms?

Please help me with the right approach to backing up and restoring SharePoint site collections with Nintex forms and workflows. I do not need the current workflow history or any data from the original site.

8 – How to build a rule reacting to the node’s workflow state change with Drupal 9 Business rules and in-core Workflows modules?

I’m building a publishing workflow with Drupal 9. Unfortunately, the Rules and Workflow modules are not yet fully supported on D9, so I decided to give the Business Rules module, together with the in-core Workflows and Content Moderation modules, a try.

Within the workflow, I need to send an email to the node’s author every time the workflow state changes. However, I haven’t found any condition specific to workflow state changes. Any ideas on how to achieve this?

I am unable to create new workflows using SharePoint Designer 2013 on new site collections

When I check for errors, I get "All tasks need to have at least one outcome defined." And when I check the task information, there are no items in the Task Outcomes section.

SP environment: SP 2013
Workflow platform type: SharePoint 2013 Workflow

Does anyone have a solution for fixing this?


nintex – Workflow history is not available for completed workflows in SharePoint Online

We have Nintex Workflow implemented in SharePoint Online.

If a workflow has completed, no data is available under the Completed workflows section.


But while a workflow is in progress, its data is available under the Running workflows section.

Also, this behavior is observed with items that are more than one month old.

Can anyone help me understand what exactly is happening here?

sharepoint enterprise – Starting certain workflows by certain user groups

Is it possible to allow only certain user groups to start certain workflows?
As an example, I have three groups (A, B, and C) and three workflows (1, 2, and 3).

Group A should not be able to start any workflow, group B can start workflow 1, and group C is allowed to start workflows 2 and 3.

views – Workflows within teams or groups

How do I go about implementing editorial workflows within a group of users?
I have users identified by a taxonomy term, for example a per-country or per-team group.

To simplify configuration and avoid adding more extensions, I created a taxonomy, e.g. “Groups”, within which I have the groups, e.g. “Group A” and “Group B”; each user is then related to a group through a field in the user’s account settings.

Team A

  • John Doe – role: Author – taxonomy: Team A
  • Jane Doe – role: Reviewer – taxonomy: Team A

Team B

  • Jake Ryan – role: Author – taxonomy: Team B
  • Joane Ryan – role: Reviewer – taxonomy: Team B

My objective is that when John Doe creates content, the workflow only emails the Reviewer of that group: Jane Doe.

I struggle to understand the Group module. From what I’ve heard about it, it would solve all my problems, but I’m having a lot of difficulty finding information (tutorials, documentation) that would allow me to grasp all the concepts this module introduces.

argo workflows – Dispatch a pod for each entry in CSV

I use Argo Workflows to dispatch lists of jobs defined in a CSV. I accomplish this by chaining a bunch of templates together, which involves:

  • Breaking up the CSV file into individual JSON objects
  • Parsing the JSON into parameters
  • Actually passing the parameters to individual pods

The YAML which accomplishes this is:

entrypoint: main

templates:
  - name: main
    steps:
      - - name: get-inputs
          # Produces the complete set of work units from the initial input
          template: split-csv
          arguments:
            artifacts:
              - name: csv-file
                s3: retrieve csv from s3

      - - name: process-each
          # Iterates over the set of work units produced by the previous step
          template: compute-one
          arguments:
            parameters:
              - name: index
                value: "{{item}}"
            artifacts:
              - name: mappings
                from: "{{steps.get-inputs.outputs.artifacts.json-data}}"
          withSequence:
            count: "{{steps.get-inputs.outputs.parameters.length}}"

  - name: compute-one
    # Processes a single work unit
    inputs:
      parameters:
        - name: index
      artifacts:
        - name: mappings
    steps:
      - - name: get-work-item
          # Retrieves the artifact references that are required to process a single unit of work
          template: get-work-item
          arguments:
            parameters:
              - name: index
                value: "{{inputs.parameters.index}}"
            artifacts:
              - name: mappings
                from: "{{inputs.artifacts.mappings}}"

      - - name: big-compute
          # Where the parameters accessed from the CSV get used.
          template: my-compute-job
          arguments:
            parameters:
              - name: param0
                value: "{{steps.get-work-item.outputs.parameters.param0}}"
              - name: param1
                value: "{{steps.get-work-item.outputs.parameters.param1}}"
              - name: param2
                value: "{{steps.get-work-item.outputs.parameters.param2}}"

  - name: get-work-item
    # From a given JSON array, get the item at `index`, which is expected to be an object,
    # and output the values of its keys to pass as parameters
    inputs:
      parameters:
        - name: index
      artifacts:
        - name: mappings
          path: /tmp/mappings.json
    outputs:
      parameters:
        - name: param0
          valueFrom:
            path: /tmp/param0
        - name: param1
          valueFrom:
            path: /tmp/param1
        - name: param2
          valueFrom:
            path: /tmp/param2
    script:
      image: stedolan/jq
      command: [sh]
      source: |
        jq -r '.[{{inputs.parameters.index}}].param0' {{inputs.artifacts.mappings.path}} > {{outputs.parameters.param0.path}}
        jq -r '.[{{inputs.parameters.index}}].param1' {{inputs.artifacts.mappings.path}} > {{outputs.parameters.param1.path}}
        jq -r '.[{{inputs.parameters.index}}].param2' {{inputs.artifacts.mappings.path}} > {{outputs.parameters.param2.path}}

  - name: split-csv
    # Given a CSV file, convert each row into a JSON-formatted object,
    # and output the list all resulting objects as an artifact,
    # and the length of this list as a parameter
    inputs:
      artifacts:
        - name: csv-file
          path: /tmp/input.csv
    script:
      image: python:alpine
      command: [python]
      source: |
        from csv import reader
        import json

        with open("{{inputs.artifacts.csv-file.path}}", "r") as f:
          rows = reader(f)
          next(rows)
          data = ( {"param0": r(0), "param1": r(1), "param2": r(2)} for r in list(rows) )

        with open("{{outputs.artifacts.json-data.path}}", "w") as f:
          f.write(json.dumps(data))

        with open("{{outputs.parameters.length.path}}", "w") as f:
          f.write(str(len(data)))
    outputs:
      parameters:
        - name: length
          valueFrom:
            path: /tmp/length
      artifacts:
        - name: json-data
          path: /tmp/data.json

Is there a way to do this with fewer steps? I keep needing to paste some form of this into other Argo workflows I create. Is there some way to modularize it and import it?
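
For reference, Argo Workflows supports exactly this kind of reuse through WorkflowTemplate resources, which other workflows can call with templateRef. A minimal sketch, assuming the split-csv and get-work-item templates above are moved into a WorkflowTemplate (the resource name csv-fanout is illustrative):

apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: csv-fanout               # illustrative resource name
spec:
  templates:
    - name: split-csv            # same body as the split-csv template above
    - name: get-work-item        # same body as the get-work-item template above

Any workflow in the same namespace can then call into it instead of carrying its own copy:

steps:
  - - name: get-inputs
      templateRef:
        name: csv-fanout         # the WorkflowTemplate resource
        template: split-csv      # the template inside it
      arguments:
        artifacts:
          - name: csv-file
            s3: retrieve csv from s3

This keeps a single copy of the fan-out logic in the cluster rather than pasting it into every workflow that needs it.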

Retrieve SharePoint 2010/2013 workflows in SharePoint 2013 using CSOM

I need to write CSOM code to retrieve both SharePoint 2010 and 2013 workflows. I am able to get only the 2013 workflows. Below is the code:

WorkflowAssociationCollection workflowcollection = list.WorkflowAssociations;
context.Load(workflowcollection);
context.ExecuteQuery();

foreach (WorkflowAssociation WFassociation in workflowcollection)
{
    workflowName = workflowName + WFassociation.Name.ToString() + ";";
}

But this code returns only the 2013 workflows. How can I retrieve the 2010 workflows as well?
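
For what it's worth, CSOM exposes two separate enumeration paths: the classic list.WorkflowAssociations collection used above, and the workflow services API in the Microsoft.SharePoint.Client.WorkflowServices namespace (a reference to Microsoft.SharePoint.Client.WorkflowServices.dll is required), which surfaces SharePoint 2013-platform workflow subscriptions. A minimal sketch querying both paths, assuming context is an authenticated ClientContext and list has already been loaded:

using System;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.WorkflowServices;

// Path 1: classic workflow associations attached to the list.
WorkflowAssociationCollection associations = list.WorkflowAssociations;
context.Load(associations);

// Path 2: workflow subscriptions from the workflow services API,
// which is where 2013-platform workflows are registered.
WorkflowServicesManager servicesManager = new WorkflowServicesManager(context, context.Web);
WorkflowSubscriptionService subscriptionService = servicesManager.GetWorkflowSubscriptionService();
WorkflowSubscriptionCollection subscriptions = subscriptionService.EnumerateSubscriptionsByList(list.Id);
context.Load(subscriptions);

context.ExecuteQuery();

foreach (WorkflowAssociation association in associations)
    Console.WriteLine("Association: " + association.Name);

foreach (WorkflowSubscription subscription in subscriptions)
    Console.WriteLine("Subscription: " + subscription.Name);

If one collection is coming back empty, comparing the output of both loops should show where each workflow is actually registered.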

Some workflows may not be shown. We were unable to reach the workflow service – SharePoint 2013 workflows issue

I am suddenly getting the error "Some workflows may not be shown. We were unable to reach the workflow service." when trying to check on the workflows I created with SharePoint Designer 2013. I checked the backend service and it is running, and the workflow service is running fine as well. I do see error 6398 in the application event log, "Unexpected exception in FeedCacheService.IsRepopulationNeeded: Failed to Decrypt data", but I am not sure whether this is related to breaking our workflow service. Can anyone suggest what should be done to fix it? Thank you. I am on the SharePoint 2016 on-premises platform.

I am getting suddenly "Some workflows may not be shown. We were unable to reach the workflow service." error when trying to check on the workflows I created using designer 2013 workflow.I checked the backend service and is running also I checked the workflow service is running fine. I see a error in application event log "6398" "Unexpected exception in FeedCacheService.IsRepopulationNeeded: Failed to Decrypt data" but not sure if this is related to breaking our workflow service. Can any one suggest what should be done to fix it thank you. I am on SharePoint 2016 platform on premise.