
Working with the Alma Jobs API

It is possible to submit jobs and check their results using Alma APIs.
The Jobs API supports two job types:
  • Manual jobs
    Perform actions on a pre-defined set of records. Available in the “Run a job” list in the Alma UI.
  • Scheduled jobs
    Jobs that run periodically. In the Alma UI, these jobs (if they have a defined schedule) appear in the Scheduled tab of the Monitor Jobs page.
The following is a detailed description of the steps required to submit a job and view its results report.
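Before diving into the steps, a quick orientation: the calls below are plain REST calls against the Jobs API. Here is a minimal Python sketch that lists the available manual jobs, assuming the requests library, an API key in the ALMA_API_KEY environment variable, and the North America gateway (adjust the base URL for your region):

    import os
    import requests

    BASE = "https://api-na.hosted.exlibrisgroup.com/almaws/v1"  # region-specific gateway
    headers = {
        "Authorization": f"apikey {os.environ['ALMA_API_KEY']}",
        "Accept": "application/json",
    }

    # List manual jobs; use type=SCHEDULED for the scheduled ones.
    resp = requests.get(f"{BASE}/conf/jobs",
                        params={"type": "MANUAL", "limit": 10},
                        headers=headers)
    resp.raise_for_status()
    for job in resp.json().get("job", []):
        print(job["id"], job["name"])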

Submitting a manual job

  1. Define the set of records.
    The set can be of several record types, e.g. items or PO lines. See the OLH (Online Help) for more details. Note that a set can also be created using the Create a set API.
  2. Identify the relevant parameters
    In order to submit a manual job, you must supply the relevant job parameters as part of the API payload. Identifying the parameters is done as follows:
    In the “Run a job” wizard, select the job that you would like to run (e.g. “Export Bibliographic Records”) and choose its relevant set and parameters. Do not actually run it in the Alma UI!
    In step 5 of the wizard, review the “API information” section. There you will find the URL and payload for the job submission.

From your application, submit the job using the URL and payload specified in the “API information” section.
Note: identifying the parameters is a one-time step, done before adding the submission API call to your application. Once you have identified the parameters, you can submit the job as many times as you need.
  3. Get the job instance report
    The output of the submission API includes information regarding the job instance.
The job instance number can be used in the Alma UI to see the status of (and additional information about) the submitted job. It can also be used in an API call:
GET /almaws/v1/conf/jobs/{job id}/instances/{instance job id}
In the example above:
GET /almaws/v1/conf/jobs/M26710920000011/instances/16197909070001021
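Putting steps 2 and 3 together, here is a minimal Python sketch that submits the job and polls its instance. The payload below is a placeholder for the one copied from the wizard, and the assumption that the submission response links to the new instance via its additional_info field should be verified against the actual response:

    import os
    import time
    import requests

    BASE = "https://api-na.hosted.exlibrisgroup.com/almaws/v1"  # region-specific gateway
    headers = {
        "Authorization": f"apikey {os.environ['ALMA_API_KEY']}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }

    job_id = "M26710920000011"   # the manual job id from the wizard's URL
    payload = {"parameter": []}  # placeholder: use the payload from the wizard

    # Submit the job.
    resp = requests.post(f"{BASE}/conf/jobs/{job_id}", params={"op": "run"},
                         json=payload, headers=headers)
    resp.raise_for_status()

    # Assumption: the response's additional_info links to the new instance,
    # so the instance id is the last path segment of that link.
    link = resp.json()["additional_info"]["link"]
    instance_id = link.rstrip("/").split("/")[-1]

    # Poll the instance until it leaves the queued/running states
    # (the exact status values are listed in the API documentation).
    while True:
        inst = requests.get(f"{BASE}/conf/jobs/{job_id}/instances/{instance_id}",
                            headers=headers)
        inst.raise_for_status()
        status = inst.json()["status"]["value"]
        print("status:", status)
        if status not in ("QUEUED", "PENDING", "INITIALIZING", "RUNNING"):
            break
        time.sleep(30)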

Submitting a scheduled job

Currently, submission of scheduled jobs via the API is supported for the following jobs:

  1. Define the related profile.
    In order to run the above jobs, a profile must be defined with the required definitions. For import jobs this is an import profile; for the other jobs it is the related integration profile.
    Note that it is possible to define a profile as active while leaving the schedule undefined.
  2. Find the job ID that should be used for submitting the job.
    The best way to do this is to use the profile ID as a filter:

    GET /almaws/v1/conf/jobs?type=SCHEDULED&profile_id={relevant profile id}

    The profile ID can be taken from the Alma UI, or retrieved using the Get integration profiles or Get import profiles APIs.

  3. Now the job can be submitted:

    POST /almaws/v1/conf/jobs/{job id}?op=run

    As a payload, an empty job structure should be sent:

    <job/>

    If you use Content-Type: application/json, simply send an empty JSON object: {}

    The output of the submission API includes information regarding the job instance (same as for a manual job).
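Steps 2 and 3 combined in a short Python sketch, under the same assumptions as the sketches above (the profile ID below is a placeholder):

    import os
    import requests

    BASE = "https://api-na.hosted.exlibrisgroup.com/almaws/v1"  # region-specific gateway
    headers = {
        "Authorization": f"apikey {os.environ['ALMA_API_KEY']}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }

    profile_id = "1234567890000561"  # placeholder: your import/integration profile id

    # Step 2: find the job id for this profile.
    jobs = requests.get(f"{BASE}/conf/jobs",
                        params={"type": "SCHEDULED", "profile_id": profile_id},
                        headers=headers)
    jobs.raise_for_status()
    job_id = jobs.json()["job"][0]["id"]

    # Step 3: submit the job with an empty JSON object as the payload.
    run = requests.post(f"{BASE}/conf/jobs/{job_id}", params={"op": "run"},
                        json={}, headers=headers)
    run.raise_for_status()
    print(run.json())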

Thresholds

Note that there is a threshold on submitting jobs using the API. The job will run only if:

  • No more than 3 jobs initiated by the API are currently running.
  • No more than 5 instances of the specific job were started in the previous hour.
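If your application may hit these thresholds, one possible mitigation is to retry the submission after a delay. This is only a sketch; the post does not specify which error the API returns when a threshold is exceeded, so the code below simply retries on any failed response:

    import time
    import requests

    def submit_with_retry(url, headers, payload, attempts=5, wait_seconds=600):
        # Submit a job via POST <url>?op=run, retrying when the call fails
        # (for example, when a threshold is hit).
        for attempt in range(1, attempts + 1):
            resp = requests.post(url, params={"op": "run"},
                                 json=payload, headers=headers)
            if resp.ok:
                return resp.json()
            print(f"attempt {attempt} failed with HTTP {resp.status_code}; "
                  f"waiting {wait_seconds}s")
            time.sleep(wait_seconds)
        resp.raise_for_status()  # give up and surface the last error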

Replies to “Working with the Alma Jobs API”

  1. I keep getting “Invalid job id” in the return XML when doing #3 above, even though I know it’s the correct job id for our SIS synchronize job.

  2. I suggest performing:
    GET /almaws/v1/conf/jobs?type=SCHEDULED&profile_id={relevant profile id}
    and checking the retrieved job ID, to make sure that the job ID you have is correct.

    Tamar

  3. This *can* be done via JSON as well. Steps in Python 3 to set up the body:

    # set_id is assumed to be defined earlier (e.g. retrieved via the Sets API)
    parameters = {
        'task_ExportParams_outputFormat_string': 'CSV',
        'task_ExportParams_exportFolder_string': 'PRIVATE',
        'task_ExportParams_ftpConfig_string': '1173652550006676',
        'task_ExportParams_ftpSubdirectory_string': 'conservt',
        'set_id': set_id,
        'job_name': 'Export Conservation Tracking Items',
    }

    body = {
        'parameter': [
            {'name': {'value': name, 'desc': name}, 'value': value}
            for name, value in parameters.items()
        ]
    }
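
    A possible follow-up, submitting this body with the requests library (the
    API key and job id below are placeholders):

    import requests

    api_key = "my-alma-api-key"   # placeholder
    job_id = "M26714550000011"    # placeholder: the manual job's id
    resp = requests.post(
        f"https://api-na.hosted.exlibrisgroup.com/almaws/v1/conf/jobs/{job_id}",
        params={"op": "run"},
        headers={"Authorization": f"apikey {api_key}",
                 "Accept": "application/json"},
        json=body,  # the body built above
    )
    resp.raise_for_status()
    print(resp.json())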

  4. I’d like to run a “Reload and Delete” process on a Discovery Import Profile via API. Is this possible? Since Reload and Delete processes can’t be scheduled, we’re stuck having to do this manually each time a vendor updates their record set. Of course, being able to schedule the job would be even better, but we could use an API call if this is possible.
