Fields not updating during insert

Hi all, when inserting records with the Salesforce adaptor version 4.6.2, some fields are not populated even though they are present in the final JSON before the insert. There is also no error message indicating that a field wasn’t written. In the JSON screenshot, for example, the field Why_tank_has_no_water__c is filled, but the value does not appear in Salesforce (label: “Why tank has no water?”). The same has happened with several other fields. Do you know what could be the problem? Thank you.


@djohnson strange that there is no error message! Could you (1) confirm which adaptor function you’re using (e.g., insert, bulk) and (2) share a snippet of your job code so we can try to spot the issue?

Also, is there any chance of a permissions issue, or of Salesforce-side automation being triggered after the OpenFn insert operation?

One test to try… via the web browser, if you manually input the same value (e.g., "No access to water in school") into the Why_tank_has_no_water__c field and then “save” the record, does the record save? Or does Salesforce provide any error message/feedback?

Yeah it’s pretty strange. I’m using insert. Here is the snippet. I chunked the records because I was running into timeout issues when working with the entire array at once.

fn(state => {
    let array = state.filteredNewMETankRecords
    let chunkMETankRecordsArray = []
    const chunkSize = 10;
    for (let i = 0; i < array.length; i += chunkSize) {
        const chunk = array.slice(i, i + chunkSize);
        // do whatever
        chunkMETankRecordsArray.push(chunk)
    }
    if (chunkMETankRecordsArray.length == 0 || chunkMETankRecordsArray == null || chunkMETankRecordsArray == undefined) {
        console.log('No records to add')
    } else {
        console.log(`There are ${chunkMETankRecordsArray[0].length} records to add`)
    }
    return { ...state, chunkMETankRecordsArray }
})

each(
    '$.chunkMETankRecordsArray[*]',
    bulk(
        'ME_tank_information__c',
        'insert',
        { extIdField: state.objIDMapping['ME_tank_information__c'], failOnError: true, allowNoOp: true },
        state => state.data
    )
)

I’m able to update the field in the browser though.

@djohnson I see that in your job you’ve specified insert as your operation, but you’re also passing in an extIdField… do you maybe want to use upsert instead?

In Salesforce, “insert” operations always create new records and do not support checking for existing records via an externalId. If you want to insert-or-update, use the “upsert” operation (which requires an externalId).

Check out the adaptor docs for more examples, but I think the issue might be that you’re using insert while also passing an externalId. Can you try either removing the extIdField argument or changing the operation to upsert?
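For reference, here’s a rough sketch of both options, reusing the object name and state references from your snippet (untested, and ME_Tank_ID__c is just a placeholder for whatever external Id field exists on your object):

// Option 1: plain insert — always creates new records, no external Id needed
each(
    '$.chunkMETankRecordsArray[*]',
    bulk(
        'ME_tank_information__c',
        'insert',
        { failOnError: true, allowNoOp: true },
        state => state.data
    )
)

// Option 2: upsert — creates or updates records, matched on an external Id field
each(
    '$.chunkMETankRecordsArray[*]',
    bulk(
        'ME_tank_information__c',
        'upsert',
        { extIdField: 'ME_Tank_ID__c', failOnError: true, allowNoOp: true },
        state => state.data
    )
)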

Or @mtuchi do you spot anything else here that might be off?

I was using insert here because I have to create records. The records are inserted into the ME_tank_information__c object, which lives under another object. The record for the parent object is created during the same workflow, so I need to fetch the Id of that newly created parent record and link the new ME_tank_information__c records to it. Is this the right way of achieving that?

@djohnson Not quite… extIdField refers to the target record you’re trying to import. In the example below, Patient_ID__c is the external Id for Patient__c:

bulk(
  "Patient__c",
  "upsert",
  { extIdField: "Patient_ID__c" },
  [
    {
      Patient_ID__c: state.data.patientUid,
      Name: state.data.patientName
    },
  ]
);

When you want to link a child record to its parent, you map the relationship in the payload body (e.g., LookupField__r.ParentRecordExtId__c), rather than specifying it as a separate argument in the bulk() function. For example:

bulk(
  "Patient__c",
  "upsert",
  { extIdField: "Patient_ID__c" },
  [
    {
      Patient_ID__c: state.data.patientUid,
      Name: state.data.patientName,
      // link the Patient to its parent HH using the external id of the parent record
      "Household__r.HH_ID__c": state.data.householdId
    },
  ]
);

Alternatively, you can first query Salesforce to get back the record Id of the parent record (e.g., a householdId stored on state)… and then map this directly when you create the patient, something like this:

bulk(
  "Patient__c",
  "upsert",
  { extIdField: "Patient_ID__c" },
  [
    {
      Patient_ID__c: state.data.patientUid,
      Name: state.data.patientName,
      Household__c: state.householdId // parent record Id fetched by the earlier query
    },
  ]
);
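That query step might look roughly like this — a sketch only, where the field names (HH_ID__c, householdExtId) and the exact location of the query results on state are assumptions, so check the adaptor docs for your version:

// query the parent record by its external Id (names here are placeholders)
query(state => `SELECT Id FROM Household__c WHERE HH_ID__c = '${state.data.householdExtId}'`);

fn(state => {
  // where query() writes its results depends on your adaptor version;
  // this assumes the returned rows are available under state.data.records
  const householdId = state.data.records[0].Id;
  return { ...state, householdId };
});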

Check out this Salesforce post for more background info on Salesforce relationship fields and the notation for them. And remember that bulk() uses the Salesforce Bulk API, in case you want to look at those official docs for more on Salesforce’s rules.

And again, the bulk() examples in the OpenFn Adaptor docs also show this mapping notation in action.

Thanks for the examples. I will test these and see if the problem disappears.

Hope they helped, @djohnson !

We’ve just added a “mark as solution” button to the forum. If @aleksa-krolls’s examples do the trick, would you mind clicking the little grey check box (near the heart) at the bottom of her post so that others can easily find it?

And if not… let us know what errors you’re seeing and we’ll try again :joy:

Thanks Taylor, I’ve marked it as a solution.
