Data Feeds and Upload Wizards

Data Feeds and the Upload Wizards both provide ways to bulk upload data into paiyroll®:

  • Data Feeds are streamlined workflows for common, regularly-used situations.

  • Upload Wizards are for highly interactive configuration data (e.g. adding new Departments).

Data Feeds

Data Feeds are implemented using Reports:

  • The submission aspect of a Data Feed report implements the logic of the data feed.

  • The viewing aspect of a Data Feed report typically summarises the results of a data feed run.

There are several types of Data Feed, using either manual upload of files or networking to fetch data. To use a Data Feed, create a Report Definition based on the required Report Template.

Data Feed types

Universal Upload holiday bookings:

Manual or Networked, using CSV. A file containing details of holiday bookings.

Universal Upload salaries:

Manual or Networked, using CSV. A file containing details of salaries.

Universal Upload shifts:

Manual or Networked, using CSV. A file containing details of shift payments.

GB Upload workers:

Manual or Networked, using CSV. A file containing details of workers.

GB Cezanne HR API:

Networked. Synchronises with Cezanne HR system.

GB Driver Recruitment Software (DRS) API:

Networked. Synchronises with Driver Recruitment Software (DRS) system.

GB PeopleHR API:

Networked. Synchronises with PeopleHR system.

GB HMRC DPS:

Networked. Downloads updates from HMRC.

GB PAYE tax code changer (P9X):

Manual. Must be invoked after all pay runs for a tax year.

Report Templates for Data Feeds are implemented using different engines depending on the nature of the data feed. These engines standardise customisation (for CSV-based data), report behaviour (for network-centric data) and so on, as described below.

CSV file engine

Comma-separated-value files can be generated from any spreadsheet program, and many systems also use them to support automation and integration. They can be specified using the following settings:

Columns:

How each output column is generated. The expected output column names depend on the Report Template. The rows of the input CSV file are transformed into the output using a definition language described below.

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

These reports also have both a submission aspect and a view aspect.
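
The CSV dialect and CSV fmtparams settings correspond to the dialect and format parameters documented for Python’s csv module (linked above). As a rough mental model only, with a hypothetical file name and illustrative parameter values, reading such a file in Python looks like this:

    import csv

    # Illustrative values only: the dialect name and fmtparams mirror the
    # parameters documented for Python's csv module.
    dialect = "excel"
    fmtparams = {"delimiter": ",", "quotechar": '"'}

    # "holiday_bookings.csv" is a hypothetical input file with a heading row.
    with open("holiday_bookings.csv", newline="") as f:
        reader = csv.DictReader(f, dialect=dialect, **fmtparams)
        for row in reader:
            print(row)  # each row is a dict keyed by the headings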

Column definition language

A set of columns is defined as follows:

columns             ::= '{' column-definition [',' column-definition]... '}'

column-definition   ::= output-column ':' column-spec

column-spec         ::= fmt [ '->' post-processor ]

output-column       ::= An alphanumeric string, with an optional
                        trailing '='. If the trailing '=' is present,
                        the '=' is discarded, but allows for the `fmt`
                        to be blank (and any `post-processor` is not
                        invoked).

fmt                 ::= A Python format string.

post-processor      ::= A Python expression.

The fmt is a Python format string which can include fixed text (such as whitespace) and cell text using replacement fields. The replacement fields use the column headings from the input CSV file as the field_name, with any characters not in the set A-Za-z0-9_ `”£$%^&*()-+=;@’~#|<,>?/ removed. In this example column-definition, the incoming CSV file has a column heading “Worker.Id”, which is transformed into the field_name “WorkerId” and used directly as the value of the output-column “Works id”:

"Works id": "{WorkerId}"

And here, “ref_t”, “row_number”, “Date”, “Client Name” and “Role Name” are transformed, then combined with the fixed text “[”, “.”, “]” and commas to form the “Display as” output-column value:

"Display as": "[{ref_t}.{row_number}] {Date}, {ClientName}, {RoleName}"

The fmt also allows limited processing of the cell text. For example:

Example fmt usage

    {Email} address

Extract the “Email” cell and combine it with fixed text “ address”.

    {Name[0]}

Extract the “Name” cell, and use its first letter only.

    \{AccountNumber:>08}

Extract the “AccountNumber” cell, zero fill to a width of 8, and add a single preceding “\”.

The result of the fmt is always stripped of any leading and trailing whitespace.
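
As a minimal sketch of the fmt step (not the production engine; the heading normalisation here is deliberately simplified and the row values are hypothetical):

    import re

    def field_name(heading):
        # Simplified for illustration: keep letters, digits, underscore and space.
        # The real engine allows the wider character set listed above.
        return re.sub(r"[^A-Za-z0-9_ ]", "", heading)

    # A hypothetical input row, keyed by the original CSV headings.
    row = {"Worker.Id": "1762", "Date": "31/12/2022"}
    fields = {field_name(k): v for k, v in row.items()}        # {"WorkerId": ..., "Date": ...}
    fields.update({"ref_t": "2022-12-31", "row_number": 1})    # engine-supplied columns such as ref_t and row_number

    fmt = "[{ref_t}.{row_number}] {Date}"                      # a fmt similar to the example above
    value = fmt.format(**fields).strip()                       # the fmt result is always stripped
    print(value)                                               # [2022-12-31.1] 31/12/2022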

The optional post-processor is a Python expression used to apply more powerful transformations than the fmt supports. The expression can also use:

  • A variable called “_” initialised with the result of fmt.

  • Variables called “_1”, “_2”, etc. initialised with the result of each placeholder in fmt.

  • Most standard Python builtins, plus re, string, unicodedata, datetime (from datetime) and utc (from datetime.timezone).

  • A builtin one() which is like all() or any() but requires exactly one set value.

For example:

Example post-processor usage

    {AccountNumber} -> '\{:>08}'.format(int(_) * 100)

Extract the “AccountNumber” cell, convert it to an int, multiply that by 100, zero fill to a width of 8, and add a single preceding “\”.

    {Date} -> str(datetime.strptime(_, '%d/%m/%Y').date())

Extract the “Date” cell, parse it as a %d/%m/%Y datetime, extract the date portion, and convert it into a string (in the default format, yyyy-mm-dd).

    "Display as": "{StartDate} -> ['Mo','Tu','We','Th','Fr','Sa','Su'][datetime.strptime(_[:10], '%d/%m/%Y').weekday()]"

Extract a day of the week from a date.

    {A},{B},{C} -> 'ABC'[_.split(',').index('YES')]

Find the first of cells A, B and C which is ‘YES’, and return ‘A’, ‘B’ or ‘C’ accordingly.

    {First Name} -> unicodedata.normalize('NFD', _).encode('ascii', 'ignore').decode()

One possible way to eliminate accented characters. NOTE: this is a lossy process, and not recommended. There are also many possible variations of this that you may prefer.

    "email": "{Email}@@{Surname}.{Works Id}@example.org -> [e for e in _.split('@@') if e][0]"

Provide a default value if the given value is blank. Here, if “Email” is blank, a default is derived from “Surname” and “Works Id”.

    "Units": "{Contract} -> '0' if _.split(',')[0] !='PAYE' else '1'"

Exclude rows that do not have ‘PAYE’ in the “Contract” cell by setting “Units” to 0 (which the CSV data-feed uploader skips) rather than 1.

    "Display as": "£{Rate1}@{Hours1}hrs £{Rate2}@{Hours2}hrs £{Rate3}@{Hours3}hrs -> _.replace('£0.00@0.00hrs', '').strip()"

Load multiple shifts but exclude those with a value of zero: “£0.00@0.00hrs”.

    "Display as": "{Client} -> _.replace('Limited ', '').replace(' Ltd', '')"

Remove “Limited ” and “ Ltd” from the “Display as” field.

    "end_date": "{end_date} -> _1 if (_1 and _1 <= str(datetime.utcnow().date())) else ''"

Exclude “end_date” if it is after today.

The result of the post-processor is NOT further normalised.
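
By way of illustration only, a much-simplified sketch of this evaluation environment might look like the following. It is not the production implementation; in particular, the real engine restricts which builtins are available:

    import re, string, unicodedata
    from datetime import datetime, timezone

    utc = timezone.utc

    def one(iterable):
        # A plausible equivalent of the engine's one() builtin:
        # true when exactly one value in the iterable is truthy.
        return sum(1 for v in iterable if v) == 1

    def run_post_processor(expression, fmt_result, placeholder_values):
        # "_" is the fmt result; "_1", "_2", ... are the individual placeholder results.
        names = {"_": fmt_result}
        names.update({f"_{i}": v for i, v in enumerate(placeholder_values, start=1)})
        names.update({"re": re, "string": string, "unicodedata": unicodedata,
                      "datetime": datetime, "utc": utc, "one": one})
        # Simplified: the real engine restricts the available builtins.
        return eval(expression, {}, names)

    # The date example from the table above:
    print(run_post_processor("str(datetime.strptime(_, '%d/%m/%Y').date())",
                             "31/12/2022", ["31/12/2022"]))    # 2022-12-31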

Networked Data engines

These engines support the electronic retrieval of data from external agencies. Report Definitions using these templates typically have some kind of credential settings; where needed, these are stored in encrypted form.

These reports also have both a submission aspect and a view aspect.

The submission aspect provides for common error handling. In some cases the network handling is specific to the data feed in question, but where a simple file transfer is involved, common handling is possible.

Networked Data engines: file fetch

This data engine is mainly intended to be used by Payroll Debbie for automated fetching of files, nominally daily, but in the context of a pay schedule. The location of the fetched file is parameterised to handle common conventions used in this kind of networked data transfer; this is signified in the following using the notation “{placeholder}”.

Run pattern:

Which days in the pay schedule should Debbie run this? We distinguish four parts of the pay schedule:

  • “Idle” days.

  • The prompt days.

  • The approval days.

  • actual_t.

See Pay Schedule Terminology for details. The run pattern can be set to:

  • Idle days only.

  • Prompt days only.

  • Idle and prompt days.

Note

Of course, the fetch can be performed manually, at any time.

Start time:

From what hour of the day (UTC time) should {date} be set to the date of the fetch? Before this time, the previous day’s date will be used. For example, if the fetch time is “2022-01-02 10:23”, and start time is 11, the default value of {date} will be the previous day “2022-01-01”.

Date adjustment:

By default, the {date} is set from the UTC run time (adjusted as described for start time). This default value may be adjusted forward or backward using this setting. Using the previous example, if the default is “2022-01-01” and the date adjustment is set to +1, the actual value of {date} will be “2022-01-02”.
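
A sketch of how {date} could be derived from the UTC run time, the Start time and the Date adjustment, reproducing the examples above (illustrative only):

    from datetime import datetime, timedelta

    def fetch_date(run_time_utc, start_hour, adjustment_days=0):
        # Before start_hour, use the previous day's date; then apply the adjustment.
        date = run_time_utc.date()
        if run_time_utc.hour < start_hour:
            date -= timedelta(days=1)
        return date + timedelta(days=adjustment_days)

    # Fetch at 2022-01-02 10:23 with start time 11 gives 2022-01-01;
    # with a date adjustment of +1 it gives 2022-01-02.
    print(fetch_date(datetime(2022, 1, 2, 10, 23), start_hour=11))                      # 2022-01-01
    print(fetch_date(datetime(2022, 1, 2, 10, 23), start_hour=11, adjustment_days=1))   # 2022-01-02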

Fetch URL:

The file to be fetched is specified using a URL. The following formats are supported:

Secure FTP (SFTP)

Overview:

sftp://[userinfo@]host[:port]/path

Examples without and with {placeholder}:

sftp://hr.com:22/get/workers.csv
sftp://t_n_a.eu/{date}_shifts.csv

Userinfo settings

If the authentication scheme is:

None:

the userinfo is set from the Username setting.

Password:

the userinfo is set from the Username and Password settings.

Private key:

the userinfo is set from the Username setting.

Placeholders

The path can use the following “{placeholder}” values:

{pp_end}:

A date in ISO format. Set to the Pay Period end date using the Pay Schedule of the given Frequency. This is typically used to distinguish files whose content refers to a given pay period.

{date}:

A date in ISO format. Set from the UTC run time of the fetch, adjusted as described for Start time and Date adjustment. This is typically used to fetch files whose content relates to a given day.
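
For illustration, the placeholders are substituted into the Fetch URL much as a Python format string would be; the values here are hypothetical:

    # Hypothetical values: {date} as derived from the run time, {pp_end} from the pay schedule.
    url_template = "sftp://t_n_a.eu/{date}_shifts.csv"
    print(url_template.format(date="2022-01-01", pp_end="2022-01-31"))
    # sftp://t_n_a.eu/2022-01-01_shifts.csv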

Authentication scheme:
None:

Uses Username setting. For sftp:, the remote system must have been configured for “passwordless login”.

Password:

Uses Username and Password settings.

Private key:

Uses Username and Private key settings. See below.

Username:

Login name for remote site.

Password:

Credentials.

Private key:

Credentials. To use this scheme, the remote system must have been configured with the corresponding public key:

  1. Typically, the paiyroll® administrator, or their IT department, would generate a public-private key pair. For example, using the OpenSSH command line tools like this:

    $ ssh-keygen -f id_<short_descriptive_phrase>
    

    will create a pair of files containing the public key and private key.

  2. These would be held securely.

  3. The public key can be sent using email, or other potentially insecure mechanism, to the remote system’s administrator to set it up.

  4. The private key would be entered in the paiyroll® Report Definition.

Universal Data Feeds

Universal Upload holiday bookings

From CSV

The uploaded file uses Holiday-based Pay Items to describe holiday bookings. Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Schedule:

Pay schedule

The file must conform to the following:

  • The content must be CSV format data. The exact style of CSV can be configured using the Report Definition. See CSV file engine for details.

  • The first row must contain headings. Each heading will be converted into an input-column and each resulting input-column must be unique.

  • Each following row must define a single holiday booking, defined using the fields described below. Each field is extracted by transforming the data in the row using a set of input-columns and the Row template which is configured on the Report Definition.

Works id:

The works_id of the worker.

Worker name:

The name of the worker. Only used for diagnostic purposes.

Start date:

Start day of holiday. ISO 8601 format, e.g. “2022-12-31”.

End date:

End day of holiday. ISO 8601 format, e.g. “2022-12-31”.

Start half-day late:

Start in the afternoon. True/False.

End half-day early:

End in the morning. True/False.

Days:

Net days, to 3 places.

Comment:

Submitter comment.

Remark:

Approver remark.

Deleted:

Has this been deleted? True/False.

Examples:

  1. If the CSV file has a column called “Starting date” which is exactly the same as the field “Start date”, the transform template would be simply:

    "{Startingdate}"
    
  2. Complete example:

    {
        "Works id": "{WorkerId}",
        "Worker name": "{WorkerName}",
        "Start date": "{Startingdate}",
        "End date": "{Endingdate}",
        "Start half-day late": "False",
        "End half-day early": "False",
        "Days": "{Days}",
        "Comment": "",
        "Remark": "{Remark}",
        "Deleted": "{Deleted}"
    }
    

    Note how “Start half-day late” and “End half-day early” are hard-coded to “False” because the source system for this file does not support half-day holiday bookings.

From Network

The same as From CSV, except that the file is fetched from the network. Each report definition specifies:

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Schedule:

Pay schedule

Dry run:

If set, discard results and some errors become warnings

Run pattern:

Which days in the pay schedule should Debbie run this?

Start time:

From what hour of the day (UTC time) should Debbie run this?

Date adjustment:

By default, the {date} is set from the UTC run time

Fetch URL:

See format details

Authentication:

Authentication scheme

Username:

Credentials

Password:

Credentials

Private key:

Include “-----BEGIN RSA PRIVATE KEY-----” and “-----END RSA PRIVATE KEY-----”

See Networked Data engines: file fetch for details of the networking settings. Payroll Debbie uses the pay schedule related settings to automatically fetch the file and then initiate the upload of its data.

Warning

Payroll Debbie cannot guarantee the fetch will succeed, so the administrator should always check the uploaded data is as expected, and if necessary, rerun the upload manually.

Universal Upload salaries

From CSV

The uploaded file uses Annual Salary-based Pay Items to describe salaries. Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Schedule:

Pay schedule

The file must conform to the following:

  • The content must be CSV format data. The exact style of CSV can be configured using the Report Definition. See CSV file engine for details.

  • The first row must contain headings. Each heading will be converted into an input-column and each resulting input-column must be unique.

  • Each following row must define a single salary, defined using the fields described below. Each field is extracted by transforming the data in the row using a set of input-columns and the Row template which is configured on the Report Definition.

Works id:

The works_id of the worker.

Worker name:

The name of the worker. Only used for diagnostic purposes.

Salary:

Annual salary.

Effective Date:

Date when salary takes effect. ISO 8601 format, e.g. “2022-12-31”.

Deleted:

Has this been deleted? True/False.

Examples:

  1. If the CSV file has a column called “Effective from” which is exactly the same as the field “Effective Date”, the transform template would be simply:

    "{Effectivefrom}"
    
  2. Complete example:

    {
        "Works id": "{WorkerId}",
        "Worker name": "{WorkerName}",
        "Salary": "{Salary}",
        "Effective Date": "{Effectivefrom}",
        "Deleted": "{Deleted}"
    }
    

Warning

Care should always be taken with files whose fields can be misinterpreted. For example, incorrect use of spreadsheet editing software can accidentally convert strings such as 000123 into the number 123.

From Network

The same as From CSV, except that the file is fetched from the network. Each report definition specifies:

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Schedule:

Pay schedule

Dry run:

If set, discard results and some errors become warnings

Run pattern:

Which days in the pay schedule should Debbie run this?

Start time:

From what hour of the day (UTC time) should Debbie run this?

Date adjustment:

By default, the {date} is set from the UTC run time

Fetch URL:

See format details

Authentication:

Authentication scheme

Username:

Credentials

Password:

Credentials

Private key:

Include “-----BEGIN RSA PRIVATE KEY-----” and “-----END RSA PRIVATE KEY-----”

See Networked Data engines: file fetch for details of the networking settings. Payroll Debbie uses the pay schedule related settings to automatically fetch the file and then initiate the upload of its data.

Warning

Payroll Debbie cannot guarantee the fetch will succeed, so the administrator should always check the uploaded data is as expected, and if necessary, rerun the upload manually.

Universal Upload shifts

From CSV

The uploaded file uses PayPer-based Pay Items to describe shifts worked. Each report definition specifies:

Pay items:

Regex for the names of the pay items

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Decimal places:

Rounding for Rate and Units

Schedule:

Pay schedule

Dry run:

If set, discard results and some errors become warnings

Skip zeroes:

Ignore, rather than warn on, zero values

The file must conform to the following:

  • The content must be CSV format data. The exact style of CSV can be configured using the Report Definition. See CSV file engine for details.

  • The first row must contain headings. Each heading will be converted into an input-column and each resulting input-column must be unique.

  • Each following row must define a single worked shift, defined using the fields described below. Each field is extracted by transforming the data in the row using a set of input-columns and the Row template which is configured on the Report Definition. In addition, the CSV engine adds predefined input-columns for row_number and ref_t (which specifies the upload date) and, for “Display as” only, a shift_number.

Works id:

The works_id of the worker.

Worker name:

The name of the worker.

Rate:

The pay rate as a float.

Units:

The units worked as a float.

Display as:

An arbitrary string identifying the shift.

By default, the Rate and Units values are expected to be non-zero, and rows with zero values are counted and cause a warning. However, the Skip zeroes setting can be used to silently ignore such rows where the CSV normally contains unwanted zero values.

Examples:

  1. If the CSV file has a column called “Candidate Id” which is exactly the same as the field “Works id”, the transform template would be simply:

    "{CandidateId}"
    
  2. If the CSV file has columns called “Shift-location” and “Shift-date”, the “Display as” field could be set to:

    "{row_number}: {CandidateId} worked at {Shiftlocation} on {Shiftdate}"
    
  3. Complete example:

    {
        "Works id": "{WorkerId}",
        "Worker name": "{WorkerName}",
        "Rate": "{HiringRate}",
        "Units": "{TotalHours}",
        "Display as": "{row_number}: {Date}, {ClientName}, {RoleName}"
    }
    

Warning

Care should always be taken with files whose fields can be misinterpreted. For example, incorrect use of spreadsheet editing software can accidentally convert strings such as 000123 into the number 123.

From Network

The same as From CSV, except that the file is fetched from the network. Each report definition specifies:

Pay items:

Regex for the names of the pay items

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Decimal places:

Rounding for Rate and Units

Schedule:

Pay schedule

Dry run:

If set, discard results and some errors become warnings

Run pattern:

Which days in the pay schedule should Debbie run this?

Start time:

From what hour of the day (UTC time) should Debbie run this?

Date adjustment:

By default, the {date} is set from the UTC run time

Fetch URL:

See format details

Authentication:

Authentication scheme

Username:

Credentials

Password:

Credentials

Private key:

Include “-----BEGIN RSA PRIVATE KEY-----” and “-----END RSA PRIVATE KEY-----”

Skip zeroes:

Ignore, rather than warn on, zero values

See Networked Data engines: file fetch for details of the networking settings. Payroll Debbie uses the pay schedule related settings to automatically fetch the file and then initiate the upload of its data.

Warning

Payroll Debbie cannot guarantee the fetch will succeed, so the administrator should always check the uploaded data is as expected, and if necessary, rerun the upload manually.

GB Data Feeds

GB Upload workers

From CSV

The uploaded file adds or updates workers. Each report definition specifies:

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Add from:

A test employee from which to seed pay items for new workers

Invite:

Send password set welcome email

Auto works id:

Automatically generate sequential works id

Dry run:

If set, discard results and some errors become warnings

Disable employee updates:

Disable employee self-service updates

The file must conform to the following:

  • The content must be CSV format data. The exact style of CSV can be configured using the Report Definition. See CSV file engine for details.

  • The first row must contain headings. Each heading will be converted into an input-column and each resulting input-column must be unique.

  • Each following row must define a single worker, defined using the fields described below. Each field is extracted by transforming the data in the row using a set of input-columns and the Row template which is configured on the Report Definition. In addition the CSV engine adds a predefined input-column for row_number.

The available fields are country dependent, but fall into the following groups:

<common>:

Country-independent fields such as “title”, “name” and “works_id”. Existing workers are matched by the “username” field and updated. If no existing worker is matched, a new worker is created.

<postaldetail>:

Postal details, prefixed by “postaldetail.”.

<taxdetail>:

Employer-specified tax details, prefixed by “taxdetail.”.

<taxeedetail>:

Worker-specified tax details, prefixed by “taxeedetail.”.

<bankdetail>:

Banking details, prefixed by “bankdetail.”.

By default, the file overwrites every field of any existing worker. Depending on the completeness of the source data, this may not be desirable after the initial upload, because certain fields are effectively managed, and may have been updated, using paiyroll®. In these situations, the output-column specification can include a trailing “=” character. If present, the trailing “=”:

  • Allows the input-columns to contain a blank value for existing workers. For new workers, non-blank values must continue to be supplied (for output-columns without a default value).

  • Preserves the existing value on the worker.
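
For example, a hypothetical column-definition such as the following allows the “department” cell to be blank for an existing worker, in which case the worker’s current department in paiyroll® is preserved:

    "department=": "{Department}"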

Examples:

  1. Working days is an array of boolean values that specifies the weekly working pattern. Notice how the “working_days” value is specified using doubled “{{” and “}}” so that the braces are not interpreted as a replacement field:

    "{{true,true,true,true,true,true,true}}"
    
  2. Complete example:

    {
        "email": "{EmailAddress}",
        "is_active": "True",
        "title": "",
        "name": "{WorkerName}",
        "mobile": "+{PhoneCountryCode}{PhoneNumber}",
        "works_id": "{WorkerId}",
        "start_date": "2019-06-01",
        "working_days": "{{true,true,true,true,true,true,true}}",
        "end_date": "",
        "department": "Washers",
        "postaldetail.line1": "{AddressLine1}",
        "postaldetail.line2": "{AddressLine2}",
        "postaldetail.line3": "{Town}",
        "postaldetail.line4": "{County}",
        "postaldetail.postcode": "{Postcode}",
        "taxdetail.weekly_hours_worked": "Other",
        "taxdetail.P45_tax_code": "",
        "taxdetail.P45_week1_month1": "False",
        "taxdetail.P45_leaving_date": "",
        "taxdetail.P45_total_pay_to_date": "",
        "taxdetail.P45_total_tax_to_date": "",
        "taxdetail.P45_continue_student_loan": "0",
        "taxdetail.P45_continue_postgraduate_loan": "0",
        "taxdetail.NI_category": "General",
        "taxdetail.director": "False",
        "taxdetail.director_from": "",
        "taxdetail.director_NI_cumulative": "False",
        "taxdetail.paid_irregularly": "True",
        "taxeedetail.date_of_birth": "{DateofBirth}",
        "taxeedetail.gender": "{Gender}",
        "taxeedetail.NI_number": "{NINumber}",
        "taxeedetail.employee_statement": "",
        "taxeedetail.student_loan": "0",
        "taxeedetail.postgraduate_loan": "0",
        "bankdetail.account_name": "{WorkerName}",
        "bankdetail.sort_code": "000000",
        "bankdetail.account_number": "00000000"
    }
    

Warning

Care should always be taken with files whose fields can be misinterpreted. For example, incorrect use of spreadsheet editing software can accidentally convert strings such as 000123 into the number 123.

From Network

The same as From CSV, except that the file is fetched from the network. Each report definition specifies:

Columns:

How each column is generated

CSV dialect:

See https://docs.python.org/library/csv.html#csv-fmt-params

CSV fmtparams:

See https://docs.python.org/library/csv.html#csv-fmt-params

Add from:

A test employee from which to seed pay items for new workers

Invite:

Send password set welcome email

Auto works id:

Automatically generate sequential works id

Disable employee updates:

Disable employee self-service updates

Schedule:

Pay schedule

Dry run:

If set, discard results and some errors become warnings

Run pattern:

Which days in the pay schedule should Debbie run this?

Start time:

From what hour of the day (UTC time) should Debbie run this?

Date adjustment:

By default, the {date} is set from the UTC run time

Fetch URL:

See format details

Authentication:

Authentication scheme

Username:

Credentials

Password:

Credentials

Private key:

Include “-----BEGIN RSA PRIVATE KEY-----” and “-----END RSA PRIVATE KEY-----”

See Networked Data engines: file fetch for details of the networking settings. Payroll Debbie uses the pay schedule related settings to automatically fetch the file and then initiate the upload of its data.

Warning

Payroll Debbie cannot guarantee the fetch will succeed, so the administrator should always check the uploaded data is as expected, and if necessary, rerun the upload manually.

GB Cezanne HR API

Synchronises with Cezanne HR system via API calls. See Integrate with Cezanne HR API for usage notes. Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

Client Id:

Id for authentication

Client Secret:

Secret for authentication

Authorisation:

Enable delegated access

Excluded:

Contract Term(s) to exclude, semi-colon separated

Add from:

A test employee from which to seed pay items for new workers

Invite:

Send password set welcome email

Schedule:

Pay schedule

Companies:

Semi-colon separated companies, blank for all

Rewind:

If set, starts fetch from 3 months ago. Default starts from last run

Sync from:

A test employee from which to seed pay items for salaries and absences

Working days:

If set, Cezanne HR controls working days

Enable Sickness:

Clear to disable Sickness fetch

Enable Holidays:

Clear to disable Holiday fetch

GB Driver Recruitment Software (DRS) API

Synchronises with Driver Recruitment Software (DRS) system via API calls. See Integrate with Driver Recruitment Software API for usage notes. Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

API endpoint:

URL allocated to you by DRS

Username:

Credentials

Password:

Credentials

Schedule:

Pay schedule

Run pattern:

Which days in the pay schedule should Debbie run this?

Start time:

From what hour of the day (UTC time) should Debbie run this?

Shifts:

Shift pay items e.g. Shift {shift_number}

Expenses:

Expenses pay items e.g. Expenses {shift_number}

VAT:

VAT pay items e.g. VAT {shift_number}

Other:

Other pay items e.g. Other {shift_number} (leave blank if not used)

Other2:

Other2 pay items e.g. Other2 {shift_number} (leave blank if not used)

Other3:

Other3 pay items e.g. Other3 {shift_number} (leave blank if not used)

Holiday:

Holiday pay item

Excluded:

Contract Type(s) to exclude, semi-colon separated

Add from:

A test employee from which to seed pay items for new workers

Invite:

Send password set welcome email

GB PeopleHR API

Synchronises Workers, Salary, Holiday, Recurring Benefits, Ad-hoc Payments, Ad-hoc Deductions and Pensions from an external PeopleHR system via API calls. See Integrate with PeopleHR API for usage notes. Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

Add from:

A test employee from which to seed pay items for new workers

Invite:

Send password set welcome email

API key:

API key for authentication

Salary query:

Blank query disables salary updates

Holiday query:

Blank query disables holiday updates

Excluded:

Employment Type(s) to exclude, semi-colon separated

Schedule:

Pay schedule

Companies:

Semi-colon separated companies, blank for all

Sync from:

A test employee from which to seed pay items for salaries and absences

Rewind:

If set, starts fetch from 3 months ago. Default starts from last run

Recurring Benefit query:

Blank disables updates

Recurring Benefit query columns:

How to configure Recurring Benefit mapping

Single Payment query:

Blank disables updates

Single Payment query columns:

How to configure Single Payment mapping

Single Deduction query:

Blank disables updates

Single Deduction query columns:

How to configure Single Deduction mapping

Pension query:

Blank disables updates

Pension query columns:

How to configure Pension mapping

GB HMRC CIS Verification

This Data Feed’s submission aspect fetches Construction Industry Scheme tax information from HMRC and applies it to workers. It is broadly analogous to GB HMRC DPS for normally taxed workers. Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

CIS username:

Alternatively, RTI username can be used

CIS password:

Alternatively, RTI password can be used with RTI username

GB HMRC DPS

This Data Feed’s submission aspect fetches PAYE and tax code (P6 and P9), student loan and post-graduate loan (SL and PGL) updates from HMRC’s Data Provisioning Service and applies them to workers. Other notifications are also fetched and stored.

Before you can use this service, all notice options need to be set to online in your HMRC PAYE services. To change notice options, navigate to:

Messages > PAYE for employers messages > Change how you get tax code and student loan notices

Ensure all options are set to YES and press NEXT (noting that only future notices issued after this change will be available online for this data feed):

[Screenshot: “Change how you get tax code and student loan notices” options page]

Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

DPS username:

Government Gateway user ID

DPS password:

Password

Rewind:

If set, starts fetch from 3 months ago. Default starts from last run

The results of the submission aspect are recorded and can be seen in the view aspect using the history on My Data Feeds:

  • PAYE updates by worker:

    1762: {'PAYE.Tax code': '944L', 'PAYE.Effective date': '2014-05-18', 'PAYE.Previous pay': '2200.00', 'PAYE.Previous tax': '315.00'}
    
  • Tax code updates by worker:

    1762: {'PAYE.Tax code': '944L', 'PAYE.Effective date'}
    
  • SL updates by worker:

    1762: {'SL.LoanStartDate': '2016-07-17', 'SL.Plan type': '02'}
    1768: {'SL.StopDate': '2014-02-06'}
    2762: {'SL.LoanStartDate': '2019-04-10', 'SL.PGL': True}
    
  • NI number updates by worker:

    1762: 'SH445566A'
    2762: ''
    
  • Other notifications in generic form:

    {
        'date': '2013-09-23', 'title': 'Late filing notice',
        'text': "Our records indicate that your Full Payment Submission (FPS) dated 15/09/2013 was sent late. You must send an FPS on, or before, the time you pay your employees. In future, please ensure that your FPSs are sent on time. From April 2014, you may incur penalties if you file your FPSs late.\nIf you think this notice is incorrect please contact HMRC's Employer Helpline.",
        'content': {
            'Information': {
                'InformationItem': {
                    'DisplayName': 'Date of Receipt', 'Value': '2013-09-15'
                }
            }
        }
    }
    {
        'date': '2013-09-23', 'title': 'Invalid DOB',
        'text': 'Change DOB 03/02/1975 to 03/02/1982, DOB 02/08/1990 to 12/08/1990, DOB 31/09/1985 to 21/09/1985 for Shawcross.',
        'content': {
            'Change': {
                'ActionItem': {'DisplayName': 'Date of Birth', 'PreviousValue': '1990-08-02', 'Value': '1990-08-12'}
            }
        }
    }
    {
        'EmployerDetails': {
            'EmployerRef': '123/1739465',
            'HMRCoffice': {...},
            'Name': "Andy's Joinery Services", 'Address': {'Line': 'Any City', 'PostCode': 'CC22 2CC'},
            'AORef': '234PL5678901'
        },
        'NotificationText': "Notification to complete a PAYE Annual Return. This notice requires you...",
        'FormType': 'P35', 'HeaderText': 'PAYE instructions to employers', 'IssueDate': '2013-05-02',
        'SequenceNumber': '3226', 'TaxYearEnd': '2014'
    }
    

GB PAYE tax code changer (P9X)

This data feed updates tax codes at the end of the tax year. It must be invoked after all pay schedules for a Company have had their last pay run for the tax year, and before the first pay runs of the new tax year.

Each report definition specifies:

Dry run:

If set, discard results and some errors become warnings

The results of the submission aspect are recorded and can be seen in the view aspect using the history on My Data Feeds, tax code updates by worker:

1762, '944L', '920L'

Upload Wizards

There are Upload Wizards for most objects such as:

  • Employee

  • Department

The data supplied by the uploaded file is different in each case, and listed when the Upload button is selected on the relevant object listing page. (See Importers for information on uploading an entire company structure).

Generally, the interactive nature of the wizard facilitates review of the data being imported, and so reduces the chance of error. For most fields, the format of the data being uploaded is clear from the context.

Warning

Care should always be taken with files whose fields can be misinterpreted. For example, incorrect use of spreadsheet editing software can accidentally convert strings such as 000123 into the number 123.

See also paiyroll® Importers, which allow an entire company structure to be uploaded, and The Toolshed, which supports repeated and specialised changes.

Foreign keys

However, fields which represent links in the database (i.e. foreign keys) must convey extra information to allow the linkage to be created. This is done using a “;” to split the field as needed:

value1;value2

For example, the Employee upload must specify:

title:

This is a link to the Title, and so is simply specified as the text equivalent of the Title, for example:

Mr.
company:

This is a link to a Company, and must be specified using the text equivalent of the Client and Company, for example:

ACME group;Drilling Ltd
department:

This is a link to a Department, and must be specified using the text equivalent of the Company and the Department. However, since the Company itself is specified using a “;” as above, the “;” at this second level must be escaped:

ACME group\;Drilling Ltd;Audit and Accounts
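
As an illustration of this convention, the following sketch splits such a value, assuming that “\;” escapes a literal “;” within a part (the wizard performs its own parsing):

    import re

    def split_linked_field(value):
        # Split on unescaped ";" and then unescape "\;" within each part.
        parts = re.split(r"(?<!\\);", value)
        return [p.replace("\\;", ";") for p in parts]

    print(split_linked_field("ACME group\\;Drilling Ltd;Audit and Accounts"))
    # ['ACME group;Drilling Ltd', 'Audit and Accounts']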