Apex Coding Interview Challenge #11

Given an integer k and a string s, find the longest substring of s that contains at most k distinct characters.

For example, given s = 'abcba' and k = 2, the longest substring with at most 2 distinct characters is 'bcb'.

Solution

public static String getLongestSubstringDistinct(String str, Integer k){
    Integer n = str.length();
    
    Integer left = 0;
    Integer right = 0;
    
    Map<Integer, Integer> mapOfChars = new Map<Integer, Integer>();
    
    while (right < n){
        if (mapOfChars.size() < k+1){
            if (!mapOfChars.containsKey(str.charAt(right))){ 
               mapOfChars.put(str.charAt(right), right);           
            } else if (right-mapOfChars.get(str.charAt(right))<=1){
                mapOfChars.put(str.charAt(right), right);  
            }
              
            right++;
        }
        System.debug('mapOfChars > ' + mapOfChars);
        
        if (mapOfChars.size() == k+1){
            List<Integer> mapOfValues = mapOfChars.values();
            mapOfValues.sort();
            Integer leftMax = mapOfValues.get(0);
            
            mapOfChars.remove(str.charAt(leftMax));
            left = leftMax + 1;
        }
    }
    
    List<Integer> charStartEnd = mapOfChars.values();
    charStartEnd.sort();
    return str.subString(charStartEnd.get(0)-1, charStartEnd.get(1));
}

Testing

System.debug(getLongestSubstringDistinct('abcba', 2)); //bcb
System.debug(getLongestSubstringDistinct('zxybobc', 2)); //bob

Apex Coding Interview Challenge #1

This question was asked during an Amazon interview

The following schema is provided:

Account

Total_Salary__c (Number)

Max_Salary__c (Number)

Account_Salary__c

Account__c (lookup)

Name (String)

Salary__c (Number)

An Account can have multiple Account_Salary__c records that look up to the Account via the Account__c field.

Write a trigger that updates the Account Total_Salary__c and Max_Salary__c fields when an Account_Salary__c record is:

  1. Inserted
  2. Updated
  3. Deleted
  4. Undeleted

Declarative Programming solution

  1. Master Detail relationship between Account and Account_Salary__c, use sum(Salary__c) and max(Salary__c) to do rollup to Account
  2. A Process Builder process or Flow that calls an @InvocableMethod to query the related Account_Salary__c records and make the update (see the sketch below)
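
For option 2, a minimal sketch of what the invocable method could look like, assuming the Flow passes in the affected Account Ids (the class name and label below are illustrative, not part of the original solution):

public with sharing class AccountSalaryRollup {
    @InvocableMethod(label='Recalculate Account Salary Rollups')
    public static void recalculate(List<Id> accountIds){
        List<Account> accountsToUpdate = new List<Account>();
        // Recompute the rollups for the affected Accounts with one aggregate query
        for (AggregateResult ar : [SELECT Account__c accId, SUM(Salary__c) sumSalary, MAX(Salary__c) maxSalary
                                   FROM Account_Salary__c WHERE Account__c IN :accountIds GROUP BY Account__c]){
            accountsToUpdate.add(new Account(
                Id = (Id)ar.get('accId'),
                Total_Salary__c = (Decimal)ar.get('sumSalary'),
                Max_Salary__c = (Decimal)ar.get('maxSalary')));
        }
        update accountsToUpdate;
    }
}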

Imperative Programming solution 

Trigger

trigger AccountSalaryTrigger on Account_Salary__c (after insert, after update, after delete, after undelete) {
    if (Trigger.isUpdate){
        AccountSalaryHelper.updateAccount(Trigger.new, Trigger.oldMap);
    } else if (Trigger.isDelete){
       AccountSalaryHelper.updateAccount(Trigger.old, null);
    } else {
       AccountSalaryHelper.updateAccount(Trigger.new, null);
    }
}

Helper class

public with sharing class AccountSalaryHelper {

    public static void updateAccount(List<Account_Salary__c> newAccountSalaries, Map<Id, Account_Salary__c> oldMap){
        Set<Id> accountIds = new Set<Id>();
        for (Account_Salary__c newAccountSalary : newAccountSalaries){
            if (oldMap!=null){
                Account_Salary__c oldAccountSalary = oldMap.get(newAccountSalary.Id);
                if (oldAccountSalary.Salary__c != newAccountSalary.Salary__c){
                    accountIds.add(newAccountSalary.Account__c);
                }
            } else {
                accountIds.add(newAccountSalary.Account__c);
            }
        }
        
        if (!accountIds.isEmpty()){
            List<AggregateResult> aggResults = [Select Account__c accId, sum(Salary__c) sumSalary, max(Salary__c) maxSalary from Account_Salary__c where Account__c IN :accountIds Group By Account__c];
            
            List<Account> accountsToUpdate = new List<Account>();
            for (AggregateResult aggResult : aggResults){
                Id accountId = (Id)aggResult.get('accId');
                if (accountId!=null){
                    Account updateAccount = new Account();
                    updateAccount.Id =accountId;
                    updateAccount.Total_Salary__c=(Decimal)aggResult.get('sumSalary');
                    updateAccount.Max_Salary__c = (Decimal)aggResult.get('maxSalary');
                    accountsToUpdate.add(updateAccount);
                }
            }
            
            if (!accountsToUpdate.isEmpty()){
                SavePoint sp = Database.setSavePoint();
                try{
                    update accountsToUpdate;
                } catch(DMLException ex){
                    Database.rollback(sp);
                }
            }
        }
    }
}

Helper Test class

@isTest
private class AccountSalaryHelperTest {

    @TestSetup static void setup(){
        Account acc = new Account();
        acc.Name = 'Test';
        insert acc;
    
        Account_Salary__c as1 = new Account_Salary__c();
        as1.Name='as1';
        as1.Account__c = acc.Id;
        as1.Salary__c = 500;
        insert as1;
        
        Account_Salary__c as2 = new Account_Salary__c();
        as2.Name = 'as2';
        as2.Account__c = acc.Id;
        as2.Salary__c = 700;
        insert as2;
    }

    @isTest static void testInsertAccountSalary(){
        Account acc = [Select Id, Max_Salary__c, Total_Salary__c from Account][0];
        
        Test.startTest();
            Account_Salary__c as3 = new Account_Salary__c();
            as3.Name = 'as3';
            as3.Account__c = acc.Id;
            as3.Salary__c = 300;
            insert as3;
        Test.stopTest();
        
        Account accAfter = [Select Id, Max_Salary__c, Total_Salary__c from Account][0];
        System.assertEquals(accAfter.Max_Salary__c, 700);
        System.assertEquals(accAfter.Total_Salary__c, 1500);
    }
    
    @isTest static void testUpdateAccountSalary(){
        Account_Salary__c accSalary = [Select Id, Salary__c from Account_Salary__c where Name='as2'][0];
         Test.startTest();
            accSalary.Salary__c = 800;
            update accSalary;
        Test.stopTest();
        
        Account acc = [Select Id, Max_Salary__c, Total_Salary__c from Account][0];
        System.assertEquals(acc.Max_Salary__c, 800);
        System.assertEquals(acc.Total_Salary__c, 1300);
    }
    
    @isTest static void testDeleteAccountSalary(){
        Test.startTest();
            delete [Select Id from Account_Salary__c where Name='as2'][0];
        Test.stopTest();
        
        Account acc = [Select Id, Max_Salary__c, Total_Salary__c from Account][0];
        System.assertEquals(acc.Max_Salary__c, 500);
        System.assertEquals(acc.Total_Salary__c, 500);
    }
}

Follow up question
1. How can we make the trigger more dynamic, so that when a new field is added it would still do the max and sum (or other aggregations) on the Account?

Answer:
Create a custom metadata mapper table that contains the SOQL aggregate expressions and the Account fields they map to. Build a dynamic SOQL query by reading the fields that need to be queried from the custom metadata, then use the SObject put method to set the field values, e.g. account.put('Total_Salary_Count__c', (Decimal)aggResult.get('countSalary'));
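
A minimal sketch of that idea, assuming a hypothetical custom metadata type Salary_Rollup_Mapping__mdt with Aggregate_Expression__c, Result_Alias__c and Account_Field__c fields that drive both the aggregate query and the Account field mapping:

// accountIds would come from the trigger context (the Account__c values on Trigger.new / Trigger.old)
Set<Id> accountIds = new Set<Id>();

// Read the mapping rows, e.g. Aggregate_Expression__c = 'SUM(Salary__c)',
// Result_Alias__c = 'sumSalary', Account_Field__c = 'Total_Salary__c'
List<Salary_Rollup_Mapping__mdt> mappings =
    [SELECT Aggregate_Expression__c, Result_Alias__c, Account_Field__c FROM Salary_Rollup_Mapping__mdt];

// Build the aggregate query dynamically from the metadata rows
List<String> selectParts = new List<String>{'Account__c accId'};
for (Salary_Rollup_Mapping__mdt m : mappings){
    selectParts.add(m.Aggregate_Expression__c + ' ' + m.Result_Alias__c);
}
String soql = 'SELECT ' + String.join(selectParts, ', ')
            + ' FROM Account_Salary__c WHERE Account__c IN :accountIds GROUP BY Account__c';

List<Account> accountsToUpdate = new List<Account>();
for (AggregateResult aggResult : Database.query(soql)){
    Account acc = new Account(Id = (Id)aggResult.get('accId'));
    for (Salary_Rollup_Mapping__mdt m : mappings){
        // SObject.put sets the mapped Account field generically
        acc.put(m.Account_Field__c, (Decimal)aggResult.get(m.Result_Alias__c));
    }
    accountsToUpdate.add(acc);
}
update accountsToUpdate;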

Salesforce System Integration Interview Prep

Remote Process Invocation – Request and Reply

Best solutions

  1. External Services – invokes a REST API call and lets you call an externally hosted service in a declarative manner. The external REST service is described by an OpenAPI or Interagent schema definition. Only primitive data types are supported; nested objects are not. The transaction is invoked from a Lightning Flow and doesn't risk exceeding the synchronous Apex governor limits.
  2. Salesforce Lightning – consume a WSDL and generate an Apex proxy class (SOAP), or enable HTTP (REST) services with the GET, PUT, POST, DELETE methods.
  3. Custom Visualforce page or button initiates an Apex HTTP callout – a user-initiated action calls an Apex action that executes the proxy Apex class.

Suboptimal

  1. Trigger – calls must be asynchronous (@future)
  2. Batch job callout – bundle responses together and make 1 callout for every 200 records processed.

Endpoint capability

The endpoint must be able to receive a web services call via HTTP, and Salesforce must be able to reach the endpoint over the internet (test with curl or Postman). Apex SOAP callout – WSDL 1.1, SOAP 1.1

Apex HTTP callout – REST service using the standard GET, POST, PUT, DELETE methods

Data Volumes

Suited to small volumes and real-time activities, due to the small timeout values and the maximum request/response size of the Apex callout solution.

Timeliness

  1. Request is typically invoked from user interface, must not keep user waiting
  2. Governor limit of 120 second timeout for callouts
  3. The remote process needs to complete within Salesforce limits and user expectations, or the system will be perceived as slow
  4. The synchronous Apex limit is 10 concurrent long-running transactions (running longer than 5 seconds) per org
    • Make sure the external server callout takes less than 5 seconds

State management

  1. Salesforce stores the external system External Id for specific record
  2. The remote system stores the Salesforce unique record ID or other unique key

Security

Any call to external service should maintain:

  1. Confidentiality
  2. Integrity
  3. Availability

Apex SOAP and HTTP callouts security considerations

  1. One way SSL is enabled by default
  2. 2 way SSL is enabled by self-signed certificate or CA-signed certificate
  3. WS-Security is not supported
  4. If necessary use one way hash or digital signature using Apex Crypto to ensure message integrity
  5. Remote system must be protected by appropriate firewall mechanism

Error Handling – errors are returned in the form of HTTP error codes:

400 – Bad Request, 401 – Unauthorized, 404 – Not Found, 500 – Internal Server Error

Recovery – changes are not committed to Salesforce until a successful response is received; retry a set number of times until success, otherwise log the error.

Idempotent Design consideration – when something goes wrong, duplicate calls can occur; include a message ID so that duplicates are not created.

Http h = new Http();
HttpRequest req = new HttpRequest();
req.setEndpoint(url); // url holds the remote service endpoint
req.setMethod('GET');

//Basic Authentication - specify in Named Credentials so no code change needed
String username = 'myname';
String password = 'mypwd';

Blob headerValue = Blob.valueOf(username + ':' + password);
String authorizationHeader = 'Basic ' +
EncodingUtil.base64Encode(headerValue);
req.setHeader('Authorization', authorizationHeader);

HttpResponse response = h.send(req);

if (response.getStatusCode() == 200) {
    Map<String, Object> results = (Map<String, Object>) JSON.deserializeUntyped(response.getBody());
    List<Object> animals = (List<Object>) results.get('animals');
    System.debug('Received the following animals:');
    for (Object animal: animals) {
        System.debug(animal);
    }
}
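
As the comment above notes, Basic Authentication details are better kept out of code with a Named Credential. A minimal sketch, assuming a Named Credential called My_Service has been configured for the remote host (the name and path are hypothetical):

// With a Named Credential the endpoint URL and authentication are managed in Setup,
// so no username/password handling is needed in Apex
HttpRequest req = new HttpRequest();
req.setEndpoint('callout:My_Service/animals');
req.setMethod('GET');
HttpResponse res = new Http().send(req);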

//Implement HttpCalloutMock so tests never make real callouts
global class YourHttpCalloutMockImpl implements HttpCalloutMock {
    global HTTPResponse respond(HTTPRequest req) {
        HttpResponse res = new HttpResponse();
        res.setBody('{"animals":["cat","dog"]}'); // stubbed response body for the test
        res.setStatusCode(200);
        return res;
    }
}

//Register the mock in the test method before the code under test makes its callout
Test.setMock(HttpCalloutMock.class, new YourHttpCalloutMockImpl());

Remote Process Invocation – Fire and forget

Best solutions

  1. Process-driven platform events – use Process Builder/workflow rules to create a process that publishes an event on insert or update
  2. Event messages – the process of communicating changes and responding to them without writing complex logic. One or more subscribers can listen to the same event and carry out actions through CometD.
  3. Customization-driven platform events – events are published by Apex triggers or batch classes
  4. Workflow-driven outbound messaging – the remote process is invoked from an insert or update event. Salesforce provides a workflow-driven outbound messaging capability that allows sending SOAP messages to remote systems.
  5. Outbound messaging and callbacks – a callback helps mitigate out-of-sequence messaging and addresses two things: idempotency and retrieval of more data. For example, when a new record is created, the callback updates the external ID field with the record ID from the external system.

Suboptimal

  1. Lightning components – user interface based scenarios, must guarantee delivery of message in code
  2. Triggers – Apex triggers to perform automation based on records changes
  3. Batch jobs – calls to the remote system can be performed from a batch job. The solution allows batch remote process execution and processing of responses from the remote system.

MyEventObject__e newEvent = new MyEventObject__e();
newEvent.Name__c = 'Test Platform Event';
newEvent.Description__c = 'This is a test message';
Database.SaveResult result = EventBus.publish(newEvent);

	
Test.startTest();
// Create test events
// ...
// Publish test events with EventBus.publish()
// ...
// Deliver test events
Test.getEventBus().deliver();
// Perform validation 
// ...
Test.stopTest();


//Listener (Java) - use the EMP Connector: authenticate with LoginHelper.login and create an EmpConnector(bayeuxParameters)
import static com.salesforce.emp.connector.LoginHelper.login;
//create consumer
SalesforceEventPayload eventPayload = new SalesforceEventPayload();
Consumer<Map<String, Object>> consumer = event -> {
    eventPayload.setPayload(event);
    itsReqHandler.handleRequest(eventPayload.getPayload());
};

//add subscription to event
subscription = empConnector.subscribe("/event/" + EVENT_NAME, replayFrom, consumer).get(WAIT_TIME, TimeUnit.SECONDS);

Error handling and recovery

Error handling – because this pattern is asynchronous, errors from the callout or event publish need to be handled by the caller, and the remote system needs to handle its own queuing, processing and error handling.

Recovery

More complex: a retry strategy is needed if no response is received within the QoS (Quality of Service) time period.

Idempotent Design considerations

Platform events are published once; there is no retry on the Salesforce side. It is up to the ESB to request that events be replayed. In a replay, the platform event's replay ID remains the same, so the ESB can detect duplicate messages based on the replay ID.

A unique ID for outbound messages is sent and needs to be tracked by the remote system to make sure duplicate messages or events are not reprocessed.

Security

  1. Platform events – conforms to the security of the existing Salesforce org
  2. Outbound messages – One way SSL enabled, two way SSL can be used with outbound message certificate. Whitelist the IP ranges of the remote integration servers, remote server needs appropriate firewalls
  3. Apex callout – one way SSL enabled, 2 way SSL through self signed or CA signed certificates. Use one way hash or digital signature (Apex Crypto class) to ensure integrity of message. Remote system protected by appropriate firewall mechanisms.

Timeliness

Less important; control is handed back to the client either immediately or after the message is successfully delivered. Outbound message acknowledgment must occur within 24 hours (can be extended to seven days), otherwise the message expires.

Platform events – events are sent to the event bus without waiting for confirmation or acknowledgement from subscribers. If a subscriber doesn't pick up a message, it can replay the event using the replay ID. High-volume events are stored for 72 hours (3 days). Subscribers use CometD to subscribe to the channel.

State management

Unique record identifiers are important for ongoing state tracking:

  1. Salesforce stores the remote system's primary key or unique surrogate key for the remote record
  2. The remote system stores the Salesforce unique record ID or some unique surrogate key

Governor Limits

Limits depend on the type of outbound call and the timing of the call

Reliable Messaging

  1. Platform Events – a form of reliable messaging. Salesforce pushes the event to subscribers. If a message doesn't get picked up, it can be replayed using the replay ID.
  2. Apex callout – it is recommended that the remote system implement JMS or MQ; however, this doesn't guarantee delivery to the remote system, so specific techniques such as processing a positive acknowledgement from the remote endpoint, in addition to custom retry logic, must be implemented.
  3. Outbound messaging – if no positive acknowledgment is received, Salesforce retries for up to 24 hours. The retry interval increases exponentially, from 15-second to 60-minute intervals.

Publisher and subscriber not in same transaction

If the event is published before the record is committed to the database, the subscriber can receive the event, look the record up, and not find it.

Publish behavior – a platform event is defined to either publish immediately or publish after commit.

Publish events – use an @future or Queueable job so the event publish (or callout) only happens once the commit has completed; see the sketch below.
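
A minimal sketch of the Queueable approach, reusing the MyEventObject__e event from the earlier example (the class name and field values are illustrative). The job runs in its own transaction after the enqueuing transaction commits, so a subscriber that looks up the record will find it:

public class PublishEventAfterCommitJob implements Queueable {
    private Id recordId;

    public PublishEventAfterCommitJob(Id recordId){
        this.recordId = recordId;
    }

    // Runs in a separate transaction, after the transaction that enqueued it has committed
    public void execute(QueueableContext ctx){
        MyEventObject__e newEvent = new MyEventObject__e(
            Name__c = 'Record committed',
            Description__c = 'Record Id: ' + recordId);
        EventBus.publish(newEvent);
    }
}

// Enqueued from the trigger or service class that performed the DML:
// System.enqueueJob(new PublishEventAfterCommitJob(acc.Id));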

Message sequencing

The remote system discards messages with a duplicate message ID.

Alternatively, Salesforce sends only the RecordId and the remote system makes a callback to Salesforce to fetch the current record state.

Handling Deletes

Salesforce workflow can't track the deletion of records, so it can't send an outbound message on delete. Workaround:

  1. Create custom object called Deleted_Records__c
  2. Create a trigger to store identifying info (a unique identifier) in the custom object, as sketched after this list
  3. Implement workflow rule to initiate message based on the creation of custom object
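
A minimal sketch of the trigger in step 2 for Account deletes, assuming Deleted_Records__c has Record_Id__c and Object_Name__c fields (both field names are illustrative):

trigger AccountDeleteTracker on Account (after delete) {
    List<Deleted_Records__c> trackers = new List<Deleted_Records__c>();
    for (Account acc : Trigger.old){
        // Capture enough information for the outbound message / remote system to act on the delete
        trackers.add(new Deleted_Records__c(
            Record_Id__c = acc.Id,
            Object_Name__c = 'Account'));
    }
    insert trackers; // the workflow/flow on Deleted_Records__c creation sends the message
}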

Batch Data Synchronization

Best solutions

  1. Salesforce change data capture (Salesforce master)
  2. Replication via third party ETL Tool (Remote system master) – Bulk API
  3. Replication via third party ETL Tool (Salesforce master) – SOAP API getUpdated()

Suboptimal

  1. Remote call-in – call into SF, causes lots of traffic, error handling, locking
  2. Remote process invocation – call to the remote system, causes traffic, error handling, locking

Extract and transform accounts, contacts, opportunities from current CRM to Salesforce (initial data load import)

Extract, transform, load billing data into Salesforce from remote system on weekly basis (ongoing)

Data master

Salesforce or Remote system

 

Salesforce Change Data Capture

Publishes insert, update, delete and undelete events which represent changes to Salesforce records. Subscribers receive near real-time record changes and sync them to an external data store.

It takes care of the continuous synchronization part; an integration app is needed to receive the events and perform the updates in the external system.

Channel – /data/ChangeEvents (all objects) or /data/{ObjectName}ChangeEvent (a single object)

Error handling – the pattern is asynchronous; the remote system should handle message queuing, processing and error handling.

Recovery – initiate retries based on the quality of service requirements. Use the replay ID to replay the stream of events; CometD can retrieve past events for up to 72 hours.
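
Change events can also be consumed inside Salesforce with an asynchronous Apex change event trigger. A minimal sketch for Account, assuming Change Data Capture is enabled for Account (the debug line stands in for whatever the sync logic would do):

trigger AccountChangeEventTrigger on AccountChangeEvent (after insert) {
    for (AccountChangeEvent ce : Trigger.new){
        // The header carries the change metadata: the type of change and the affected record Ids
        EventBus.ChangeEventHeader header = ce.ChangeEventHeader;
        System.debug(header.changeType + ' -> ' + String.join(header.recordIds, ', '));
    }
}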

Bulk API – Replication via 3rd party ETL tool (more than 100 000 records)

Allows change data capture to run against the source data. The tool reacts to changes in the source data set, transforms the data and then calls the Salesforce Bulk API to issue DML statements (the SOAP API can also be used).

  1. Read the control table to determine the last time the job ran and any other control values needed
  2. Use the above control values to filter the query
  3. Apply predefined processing rules, validation rules, enrichments
  4. Use the available connectors/transformation capabilities to create the destination data set
  5. Write the data to Salesforce objects
  6. If processing succeeds, update the control values
  7. If processing fails, update the control values so the process can restart

Consider

  1. Chain and sequence ETL jobs to provide cohesive process
  2. Use primary key for both systems to match incoming data
  3. Use specific API methods to extract only updated data
  4. Importing master-detail or lookup, consider using the parent key at the source to avoid locking. Group contacts for an account to be imported at the same time.
  5. Post-import processing should only process data selectively
  6. Disable Apex triggers, workflow and validation rules
  7. Use the defer calculations permission to defer sharing calculations until all data loaded

Error handling

If an error occurs during the read operation, retry. If errors repeat, implement control tables/error tables so you can:

  1. Log the error
  2. Retry the read operation
  3. Terminate if successful
  4. Send a notification

Security

  1. Lightning Platform license with at least “API Only” user permission
  2. Standard encryption to keep password access secure
  3. Use HTTPS protocol

Timeliness

Take care to design the interface so all batch processes complete in a designated batch window

Loading batches during business hours not recommended

Global operations should run all batch processes at the same time

State management

Use surrogate key between two systems.

Standard optimistic record locking occurs on the platform, and any updates made using the API require the user who is editing the record to refresh it and re-initiate their transaction. Optimistic locking means:

  1. Salesforce doesn't maintain the state of a record being edited
  2. Upon read, it records the time when the data was extracted
  3. When the user updates the record, the system checks before saving whether another user has updated it in the meantime
  4. If so, the system notifies the user that an update was made and the user retrieves the latest version before updating

Middleware capabilities

Middleware tool that supports Bulk API

Supports the getUpdated() function – this provides the closest implementation to a standard change data capture capability in Salesforce

Extracting data

Use the getUpdated() and getDeleted() SOAP API to sync an external system with Salesforce at intervals greater than 5 minutes. Use outbound messaging for more frequent syncing.

When a query can return more than a million results, consider using the query capability of the Bulk API.

Remote Call-In

Best solutions

  1. SOAP API – Publish events, query data, CRUD. Synchronous API, waits until it receives a response. Generated WSDL – enterprise (strongly-typed), partner (loosely typed). Must have a valid login and obtain session to perform calls. Allows partial success if the records are marked with errors, also allows “all or nothing” behavior.
  2. REST API – Publish events, query data, CRUD. Synchronous API, waits until it receives a response. Lightweight and provides simple method for interacting with Salesforce. Must have a valid login and obtain session to perform calls. Allows partial success if the records are marked with errors, also allows “all or nothing” behavior. Output of one to use as input to next call.

Suboptimal

  1. Apex web services – use when full transaction support is required or custom logic needs to be applied before committing.
  2. Apex REST services – lightweight implementation of custom REST services (see the sketch after this list)
  3. Bulk API – submit a number of batches to query, update, upsert or delete a large number of records.
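
A minimal sketch of a custom Apex REST service, reusing the Account rollup fields from the earlier challenge (the URL mapping and query are illustrative):

// Exposes GET /services/apexrest/accountSalary/<accountId> for remote call-in
@RestResource(urlMapping='/accountSalary/*')
global with sharing class AccountSalaryRestService {

    @HttpGet
    global static Account doGet() {
        RestRequest req = RestContext.request;
        // The last path segment is the Account Id (assumes exactly one matching record)
        String accountId = req.requestURI.substringAfterLast('/');
        return [SELECT Id, Name, Total_Salary__c, Max_Salary__c FROM Account WHERE Id = :accountId];
    }
}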

Authentication

Salesforce supports SSL (Secure Sockets Layer) and TLS (Transport Layer Security); ciphers must be at least 128 bits long.

Remote system has to authenticate before accessing any REST service. Remote system can use OAuth 2.0 or username/password. Client has to set the authorization HTTP header with the appropriate value.

It is recommended that the client caches the session ID rather than creating a new session for every call.

Accessibility

Salesforce provides a REST API that remote system can use to:

  1. Query data in org
  2. Publish events to org
  3. CRUD of data
  4. Metadata

Synchronous API

After a call to the server, the caller waits for a response; asynchronous calls into Salesforce are not supported.

REST vs SOAP

REST exposes resources as URIs and uses HTTP verbs to define CRUD operations. Unlike SOAP, REST requires no predefined contract, uses XML or JSON for responses, and is loosely typed. Advantages include ease of integration and suitability for mobile and web apps.

Security

The client executing the REST call needs a valid Salesforce login and must obtain an access token. The API respects object- and field-level security for the logged-in user.

Transaction/Commit Behavior

By default every record is treated as a separate transaction and committed separately; failure of one record does not cause a rollback of the other changes. The composite API can make a series of updates in one call.

REST composite resource

Performs multiple operations in a single API call, and the output of one call can be used as the input to the next. All response bodies and HTTP statuses are returned in a single response body, and the entire request counts as a single call toward the API limit.

Bulk API

For bulk operations, use the REST-based Bulk API

Event driven architecture

Platform events are defined the same way as a Salesforce object. Publishing to the event bus is similar to inserting a Salesforce record; only create/insert is supported.

Error handling

All remote call-in methods and custom APIs require the remote system to handle any subsequent errors, such as timeouts and retries. Middleware can be used to provide the logic for recovery and error handling.

Recovery

A custom retry mechanism needs to be created if QoS requirements dictate it. It is important to consider idempotent design characteristics.

Timeliness

Session timeout – the session times out when there is no activity, based on the Salesforce org's session timeout setting

Query timeout – each query has a timeout limit of 120 seconds

Data volumes

CRUD – 200 records per call

Blob size – 2 GB for ContentVersion (Chatter)

Query – query() and queryMore() return 500 records by default, up to a maximum batch size of 2,000

State management

Salesforce stores the remote system's primary key or unique surrogate key for the remote record

The remote system stores the Salesforce unique record ID or some unique surrogate key

Governor limits

5,000 API calls per 24-hour period

10 query cursors open at a time

Reliable messaging

Reliable messaging resolves the problem of guaranteeing delivery of a message to a remote system where the individual components may be unreliable. The SOAP API and REST API are synchronous and don't provide explicit support for reliable messaging protocols.

Data Virtualization

Best solutions

  1. Salesforce Connect

Suboptimal

  1. Request and reply – Salesforce web services APIs (SOAP or REST) to make ad-hoc data requests to access and update external system data

Access data from external sources along with Salesforce data. Pull data from legacy systems, SAP, Microsoft, Oracle in real time without making a copy of the data.

Salesforce connect maps data tables in external systems to external objects in your org. External objects are similar to custom objects, except they map to data located outside SF org. Uses live connection to external data to keep external objects up to date.

Salesforce Connect lets you:

  1. Query data in an external system
  2. CRUD data in external system
  3. Define relationships between external objects and standard or custom objects
  4. Enable Chatter feed on external object page for collaboration
  5. Run reports on external data

Salesforce Connect Adapters

OData 2.0 or OData 4.0 adapter – connects to data exposed by any OData 2.0 or OData 4.0 producer

Cross-org adapter – connects to data that's stored in another Salesforce org. Uses the Lightning Platform REST API

Custom adapter created via Apex – develop your own adapter with the Apex Connector Framework

Calling mechanism

External objects – Salesforce Connect maps external objects to data tables in external systems and accesses the data on demand and in real time. It provides seamless integration with the Lightning Platform: global search, lookup relationships and record feeds all work.

External objects are also available to Apex, SOSL and SOQL queries, the Salesforce APIs, the Metadata API, change sets and packages.
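
A minimal sketch of querying an external object from Apex, assuming an external table has been mapped to a hypothetical Order_Detail__x external object with OrderNumber__c and Amount__c fields:

// External objects use the __x suffix; the query reads on demand from the external system
List<Order_Detail__x> details = [SELECT ExternalId, OrderNumber__c, Amount__c
                                 FROM Order_Detail__x
                                 WHERE OrderNumber__c = '100001'
                                 LIMIT 10];
for (Order_Detail__x detail : details){
    System.debug(detail.OrderNumber__c + ' -> ' + detail.Amount__c);
}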

Error handling

Run the Salesforce Connect Validator tool to execute some common queries and surface error types and failure causes

Benefits

  1. Doesn’t consume data storage in SF
  2. Don’t have to worry about regular sync between systems
  3. Declarative setup, can be set up quickly
  4. Users can access external data with same functionality
  5. Ability to do federated search
  6. Ability to run reports

Considerations

Can impact report performance

Security considerations

Adheres to Salesforce org-level security; use HTTPS to connect to any remote system.

For OData, understand the behaviors, limitations and recommendations around CSRF (Cross-Site Request Forgery)

Timeliness

The request is invoked from the user interface, so it should not keep the user waiting

Retrieving data from the external system may take long; Salesforce enforces a 120-second maximum timeout

The remote process should complete in a timely manner

Data volumes

Used mainly for small-volume, real-time activities, due to the small timeout values and the maximum request/response size of the Apex callout solution.

State management

Salesforce stores the primary key or unique surrogate key for the remote record

The remote system stores the Salesforce unique record ID or another unique surrogate key

Apex Coding Interview Challenge #5

Given an encoded string, return its decoded string. The encoding rule is: k[encoded_string], where the encoded_string inside the square brackets is being repeated exactly k times. Note that k is guaranteed to be a positive integer.

s = "3[a]2[bc]", return "aaabcbc"
s = "3[a2[c]]", return "accaccacc"
s = "2[abc]3[cd]ef", return "abcabccdcdcdef"

DecodeHelper class

public class DecodeHelper {

    private Integer idx; 
    private String zero;

    public String decodeString(String s) {
        zero ='0';
        idx = 0;
        return decodeStringHelper(s);
    }
    
    private String decodeStringHelper(String s) {
     String ans = '';
     Integer repeat = 0;

     while (idx < s.length()) {
         Integer ch = s.charAt(idx);
         Integer[] intArr = new Integer[1];
         intArr[0] = ch;
         String chStr = String.fromCharArray(intArr);
         if (chStr == ']') {
             return ans;
         } else if (chStr == '[') {
             ++idx;
             String str = decodeStringHelper(s);
             while (repeat > 0) {
                 ans+=str;
                 --repeat;
             }
         } else if (chStr.isNumeric()) {
             repeat = repeat * 10 + ch - zero.charAt(0);
             
         } else {
             ans+=chStr;
         }
         ++idx;
     }
     return ans;
   }
}

DecodeHelper Test

@isTest private class DecodeHelperTest {

    @isTest static void testDecodeString(){
        Test.startTest();
            DecodeHelper dh = new DecodeHelper();
            String decodedString1 = dh.decodeString('3[a]2[bc]');
            System.assertEquals(decodedString1, 'aaabcbc');
            String decodedString2 = dh.decodeString('3[a2[c]]');
            System.assertEquals(decodedString2, 'accaccacc');
            String decodedString3 = dh.decodeString('2[abc]3[cd]ef');
            System.assertEquals(decodedString3, 'abcabccdcdcdef');
        Test.stopTest();
    }
}

Apex passing parameters to Batch job during scheduling

Here is the use case:

You have one job that you want to run both weekly and monthly, but the monthly run also has to generate a notification. You don't need to create two classes; instead, pass a parameter when scheduling so the job knows whether it is a weekly or monthly run.

The batch Job class that accepts constructor parameter jobRunFrequency

global class AccountBatchJob implements System.Schedulable, Database.Batchable<SObject>, Database.Stateful, Database.AllowsCallouts {

	private static final String RUN_WEEKLY_JOB = 'RUN_WEEKLY_JOB';
	//This is the key that needs to be used to generate notifications for Monthly 
	private static final String RUN_MONTHLY_JOB = 'RUN_MONTHLY_JOB';
	private String jobRunFrequency;

	String query = 'Select Id, FirstName, LastName from Account';

	public AccountBatchJob(){
		this.jobRunFrequency = RUN_WEEKLY_JOB;
	}

	public AccountBatchJob(String jobRunFrequency){
		this.jobRunFrequency = jobRunFrequency;
	}

        global Database.QueryLocator start(Database.BatchableContext BC) {
		return Database.getQueryLocator(query);
	}

    global void execute(Database.BatchableContext BC, List<Account> accounts) {
     ....
     if (RUN_MONTHLY_JOB.equalsIgnoreCase(jobRunFrequency)){
     ......
     }
    }

	//Required by Database.Batchable - no post-processing needed for this example
	global void finish(Database.BatchableContext BC) {
	}

	//Required by System.Schedulable - kicks off the batch with the stored frequency when the scheduled job fires
	global void execute(SchedulableContext sc) {
		Database.executeBatch(new AccountBatchJob(this.jobRunFrequency));
	}
}

Test code for the batch job, passing the parameter that marks it as the monthly job

Test.startTest();
	Database.executeBatch(new AccountBatchJob('RUN_MONTHLY_JOB'), 1);
Test.stopTest();

Test code for the scheduled job, checking that it is the monthly job

Test.startTest();
	String jobId = System.schedule('AccountBatchJob', '0 0 0 15 3 ? 2022', new AccountBatchJob('RUN_MONTHLY_JOB'));
	CronTrigger ct = [SELECT Id, CronExpression, TimesTriggered, NextFireTime FROM CronTrigger WHERE id = :jobId];
	System.assertEquals('0 0 0 15 3 ? 2022', ct.CronExpression);
Test.stopTest();

Apex Cache Tracker to remove session cache

Here is the problem that needed to be solved:
1. Session cache is used to cache some user information, like the user's profile or user settings
2. When an external system updates that information and tries to remove the now-stale session cache, the removal fails because it does not have access to the user's session cache

Solution:
Build a session tracker in the org cache to track missed removals of session cache entries

Step 1: When a session cache remove fails, add it to the org-level tracker cache. The format of the session keys is:
key + userId (Salesforce userId)

That way I can keep track of which userIds still need their session cache removed.

String splitStrKey = key.substring(0, key.length()-18);
String splitUserId = key.substring(key.length()-18, key.length());

will split the key into the key prefix and the userId (an 18-character Salesforce Id)

public Boolean remove(String key) {
      if (!System.isBatch() && !System.isFuture() && !Test.isRunningTest()){
        Boolean removed = false;
        if (sessionCacheEnabled && sessionPartition!=null) {
              System.debug('inside session cache remove key: ' + key);
              removed = sessionPartition.remove(key);
              if (!removed && key!=null && key.length() > 18){
                String splitStrKey = key.substring(0, key.length()-18);
                String splitUserId = key.substring(key.length()-18, key.length());
                System.debug('Session remove failed key ' + splitStrKey + ' for ' + splitUserId + ' adding to tracker');
                removeFailedAddToTracker(splitStrKey, splitUserId, null);
              }
          } else if (orgPartition!=null){
              removed = orgPartition.remove(key);
          }

        if (removed) {
          System.debug(LoggingLevel.DEBUG, 'Removed key ' + key);
          return true;
        } else{
          System.debug(LoggingLevel.DEBUG, 'Not removed key not found ' + key);
          return false;
        }
      } else {
        System.debug(LoggingLevel.DEBUG, 'Skipping cache remove in batch ' + key);
        return false;
      }
    }

Step 2: Add the session keys to the org-level tracker cache (CACHE_USER_CACHE_TRACKER). It is stored as a Map<Id, Map<String, Id>> keyed by user Id, where the inner map holds all the session key prefixes that still need to be removed for that user

public void removeFailedAddToTracker(String keyPrefix, Id userId, Set<Id> clientIds){
      if (App_Service.isIntegrationUser(userId)){
        Map<String, String> mapOfContactUserIds = (Map<String, String>)getOrgPartition().get(App_Rest_Constants.CACHE_USER_CONTACT_MAP);
        Map<Id, Map<String, Id>> mapOfExistingCacheQueueTracker = (Map<Id, Map<String, Id>>)getOrgPartition().get(App_Rest_Constants.CACHE_USER_CACHE_TRACKER);
        if (clientIds!=null && !clientIds.isEmpty()){
          for (Id clientId : clientIds){
            if (mapOfContactUserIds!=null && mapOfContactUserIds.containsKey(clientId)){
              Id userIdToClearCache = mapOfContactUserIds.get(clientId);

              if (mapOfExistingCacheQueueTracker!=null){
                if (mapOfExistingCacheQueueTracker.containsKey(userIdToClearCache)){
                  Map<String, Id> existingCacheStrings = mapOfExistingCacheQueueTracker.get(userIdToClearCache);
                  existingCacheStrings.put(keyPrefix, userIdToClearCache);
                  mapOfExistingCacheQueueTracker.put(userIdToClearCache, existingCacheStrings);
                } else {
                  mapOfExistingCacheQueueTracker.put(userIdToClearCache, new Map<String, Id>{keyPrefix=>userIdToClearCache});
                }
              } else {
                mapOfExistingCacheQueueTracker = new Map<Id, Map<String, Id>>();
                mapOfExistingCacheQueueTracker.put(userIdToClearCache, new Map<String, Id>{keyPrefix=>userIdToClearCache});
              }
            }
          }
        } else {
          if (mapOfExistingCacheQueueTracker!=null && mapOfExistingCacheQueueTracker.containsKey(userId)){
            Map<String, Id> existingMap = mapOfExistingCacheQueueTracker.get(userId);
            existingMap.put(keyPrefix, userId);
            mapOfExistingCacheQueueTracker.put(userId, existingMap);
          } else {
            if (mapOfExistingCacheQueueTracker==null){
              mapOfExistingCacheQueueTracker = new Map<Id, Map<String, Id>>();
            }
            mapOfExistingCacheQueueTracker.put(userId, new Map<String, Id>{keyPrefix=>userId});
          }
        }

        if (mapOfExistingCacheQueueTracker!=null)
          getOrgPartition().put(App_Constants.CACHE_USER_CACHE_TRACKER, mapOfExistingCacheQueueTracker);
      }
    }

Step 3: Every time we check whether the session cache contains a key, we first see if the same key is present in the CACHE_USER_CACHE_TRACKER cache. If it is, we remove it from the user's session cache so that the stale entry is gone and a fresh query can be done

  public Boolean containsKey(String cacheKey){
      Boolean containsKey = false;
      if (!System.isBatch() && !System.isFuture() && !Test.isRunningTest()){
           if (sessionCacheEnabled==false && orgPartition!=null){
              if (orgPartition.get(cacheKey)!=null){
                containsKey = true;
              }
          } else if (sessionCacheEnabled && sessionPartition!=null) {
              if (sessionPartition.get(cacheKey)!=null){
                containsKey = true;

                Map<Id, Map<String, Id>> mapOfExistingCacheQueueTracker = (Map<Id, Map<String, Id>>)getOrgPartition().get(App_Rest_Constants.CACHE_USER_CACHE_TRACKER);
              	if (mapOfExistingCacheQueueTracker!=null && mapOfExistingCacheQueueTracker.containsKey(UserInfo.getUserId())){
            			Map<String, Id> flagsToClear = mapOfExistingCacheQueueTracker.get(UserInfo.getUserId());

                  Boolean removeFlag = false;
            			for (String flagToClear : flagsToClear.keySet()){
            				if (flagToClear.equalsIgnoreCase(cacheKey)){
                      String keyToRemove = flagToClear + flagsToClear.get(flagToClear);
            					Boolean removeItemFromCache = getSessionPartition().remove(keyToRemove);
                      
                      removeFlag = true;
                      containsKey = false;
            				}
            			}

                  if (removeFlag){
                    for (String flagToClear : flagsToClear.keySet()){
                      flagsToClear.remove(cacheKey);
                      if (flagsToClear.isEmpty() && flagsToClear.size()==0){
                        mapOfExistingCacheQueueTracker.remove(UserInfo.getUserId());
                      } else {
                        mapOfExistingCacheQueueTracker.put(UserInfo.getUserId(), flagsToClear);
                      }
                    }
                    
                    getOrgPartition().put(App_Rest_Constants.CACHE_USER_CACHE_TRACKER, mapOfExistingCacheQueueTracker);
                  }

            		}
              }
          }
       }
      return containsKey;
    }

Apex Coding Challenge Find Highest Frequency of Numbers

Problem: Find the number that has the highest frequency in a list of integers.

Input: 1,6,2,1,6,1

Output: 1 //because 1 occurs 3 times in the list

Option 1: Use Hashmap to iterate list

	List<Integer> nums = new List<Integer>{1,6,2,1,6,1};
		Map<Integer, Integer> numMap = new Map<Integer, Integer>();
		for (Integer num : nums){
			if (numMap.containsKey(num)){
				Integer numFreq = numMap.get(num);
				numMap.put(num, numFreq+1);
			} else {
				numMap.put(num, 1);
			}
		}
		
		Integer biggestFreq = 0;
		Integer biggestVal = 0;
		for (Integer num : numMap.keySet()){
			if (numMap.get(num) > biggestFreq){
				biggestFreq = numMap.get(num);
				biggestVal = num;
			}
		}
		
		System.debug(biggestVal);

Option 2: Use wrapper class with compare to sort wrapper

List<Integer> nums = new List<Integer>{1,6,2,1,6,1};

Map<Integer, NumFrequencyWrapper> numMap = new Map<Integer, NumFrequencyWrapper>();
for (Integer num : nums){
	if (numMap.containsKey(num)){
		NumFrequencyWrapper numFreqWrapper = numMap.get(num);
		numFreqWrapper.setFrequency(numFreqWrapper.getFrequency()+1);
		numMap.put(num, numFreqWrapper);
	} else {
		NumFrequencyWrapper numFrequencyWrapper = new NumFrequencyWrapper();
		numFrequencyWrapper.setNum(num);
		numFrequencyWrapper.setFrequency(1);
		numMap.put(num, numFrequencyWrapper);
	}
}
	
List<NumFrequencyWrapper> frequencyWrapperList = new List<NumFrequencyWrapper>(numMap.values());
frequencyWrapperList.sort(new NumFrequencyWrapperCompare()); // List.sort with a Comparator (requires a recent API version with System.Comparator)
System.debug(frequencyWrapperList.get(0).getNum());

public class NumFrequencyWrapper {
	private Integer num;
	private Integer frequency;
	
	public void setNum(Integer num){
		this.num = num;
	}
	
	public Integer getNum(){
		return num;
	}
	
	public void setFrequency(Integer frequency){
		this.frequency = frequency;
	}
	
	public Integer getFrequency(){
		return this.frequency;
	}
}
	
public class NumFrequencyWrapperCompare implements Comparator<NumFrequencyWrapper>{
	public Integer compare(NumFrequencyWrapper a, NumFrequencyWrapper b) {
		// descending order: highest frequency first
		return b.getFrequency() - a.getFrequency();
	}
}

Option 3: Use buckets to group numbers by their frequency

List<Integer> nums = new List<Integer>{1,6,2,1,6,1};
Integer returnNums = 2;
Map<Integer, Integer> numMap = new Map<Integer, Integer>();
for (Integer num : nums){
	if (numMap.containsKey(num)){
		Integer numFreq = numMap.get(num);
		numMap.put(num, numFreq+1);
	} else {
		numMap.put(num, 1);
	}
}

Map<Integer, List<Integer>> mapOfBucketWithValues = new Map<Integer, List<Integer>>();
for (Integer num : numMap.keySet()){
	Integer numFrequency = numMap.get(num);
	if (mapOfBucketWithValues.containsKey(numFrequency)){
		List<Integer> existingIndexNum = mapOfBucketWithValues.get(numFrequency);
		existingIndexNum.add(num);
		mapOfBucketWithValues.put(numFrequency, existingIndexNum);
	} else {
		List<Integer> numList = new List<Integer>();
		numList.add(num);
		mapOfBucketWithValues.put(numFrequency, numList);
	}
}

for (Integer k=nums.size(), returnedNums=0; 1<=k; k--){
	if (mapOfBucketWithValues.containsKey(k)){
		for (Integer numBucket : mapOfBucketWithValues.get(k)){
			if (returnedNums < returnNums){
				System.debug(numBucket);
				returnedNums++;
			}
		}
	}	
}

Apex Clear all fields for a SObject record

The clearOutRecords method iterates over all the populated fields of the currentRecord that is passed in, then:
1. Exclude the fields as part of the fieldsToExcludeForClearOut Set and relationship fields
2. Check if the field is not null and updateable
3. Special logic to set fields to predefined values
4. Set all other fields to null
5. Return the SObject with fields as null


private static Set<String> fieldsToExcludeForClearOut = new Set<String>{'Cases', 'DoNotCall', 
'HasOptedOutOfFax', 'HasOptedOutOfEmail', 'LastName', 
'FirstName', 'Email', 'AccountId', 'CreatedDate',
'IsDeleted','Interval__c','OwnerId',
'OtherGeocodeAccuracy','MailingGeocodeAccuracy',
'BillingGeocodeAccuracy','ShippingGeocodeAccuracy'};

    public SObject clearOutRecords(SObject currentRecord, String sObjectName){
      SObjectType objToken = Schema.getGlobalDescribe().get(sObjectName);
      DescribeSObjectResult objDef = objToken.getDescribe();
      Map<String, SObjectField> fieldsSobject = objDef.fields.getMap();
      Map<String, Object> fields = currentRecord.getPopulatedFieldsAsMap();
      Type classType = Type.forName(sObjectName);
      SObject mergedRecord = (SObject)JSON.deserialize('{}', classType);
      for (String field : fields.keySet()){
        if (!fieldsToExcludeForClearOut.contains(field) && !field.contains('__r')){
          if (currentRecord.get(field)!=null && fieldsSobject.get(field).getDescribe().isUpdateable()){
            if ('User_Status__c'.equals(field)){
              mergedRecord.put(field, 'Incomplete');
            } else if ('Is_Mail_Same_As_Home__c'.equals(field)){
              mergedRecord.put(field, false);
            } else {
              mergedRecord.put(field, null);
            }
          } else if ('Id'.equals(field)){
            mergedRecord.put(field, currentRecord.get(field));
          }
        }
      }
      return mergedRecord;
    }

Initializing the clearOutRecords method

1. Query the fields that you would like to clear
2. Pass the Object to the clearOutRecords method

Contact queryContact = [Select Id, FirstName, LastName, Email, Birthdate, MailingState, Age__c from Contact where MailingState!=null limit 1 ];

Contact clearedOutContact = (Contact)App_Service.instance.clearOutRecords(queryContact, 'Contact');
        

Output

Contact:{Id=0036300000TQZIwAAP, Birthdate=null, MailingState=null}

Apex remove sensitive data from json

When you need to remove sensitive data from JSON before logging, the following method will remove a predefined list of sensitive keys.

String bodyArgs = '{"name":"test", "ssn":"324234234", "email":"test@mail.com"}';
Object bodyObj = (Object)JSON.deserializeUntyped(bodyArgs);

Map<String, Object> mapObj = new Map<String, Object>();
if (bodyObj instanceof List<Object>){
	List<Object> lstObjs = (List<Object>)JSON.deserializeUntyped(bodyArgs);
    for (Object lstObj : lstObjs){
       Map<String,Object> parseLstObj = (Map<String,Object>)JSON.deserializeUntyped(JSON.serialize(lstObj));
       mapObj.putAll(parseLstObj);
    }
} else {
	mapObj = (Map<String,Object>)JSON.deserializeUntyped(bodyArgs);
}

Map<String, String> newMappedValues = new Map<String, String>();
System.debug(removeAttributes(mapObj, newMappedValues));
// Output: {name=test}

The removeAttributes method iterates over all the keys in the payload, removes the sensitive keys, and returns the remaining key/value pairs.

private Set<String> removeSensitiveKeyValue = new Set<String>{'ssn', 'email', 'dob'};

public Map<String, String> removeAttributes(Map<String,Object> jsonObj, Map<String, String> mappedKeys)  {
	for(String key : new List<String>(jsonObj.keySet())) { // iterate over a copy of the keys so removing from the map is safe
		if (removeSensitiveKeyValue.contains(key)){
			jsonObj.remove(key);
		} else {
	      if(jsonObj.get(key) instanceof Map<String,Object>) {
	          removeAttributes((Map<String,Object>)jsonObj.get(key), mappedKeys);
	      } else if(jsonObj.get(key) instanceof List<Object>) {
	          for(Object listItem : (List<Object>)jsonObj.get(key)) {
	           if(listItem instanceof Map<String,Object>)  {
	        	removeAttributes((Map<String,Object>)listItem, mappedKeys);
	           }
	         }
	      } else {
			mappedKeys.put(key, String.valueOf(jsonObj.get(key)));
		  }
	  	}
	}
	return mappedKeys;
}

Apex Coding Challenge for Iterating Lists

Here is the problem statement:

Given an expectedSum value, find the pairs of numbers (n1 + n2 = expectedSum) in a list whose sum equals the expectedSum. I am using the following list

1,3,4,4,5,9

and looking for the sum to be 8 (3+5, 4+4).

*Note: the list is sorted

Solution 1:


        Integer expectedSum = 8;
		List<Integer> listInts = new List<Integer>{1, 3, 4, 4, 5, 9};
		
		for (Integer k=0; k<listInts.size();k++){
			for (Integer j=k+1; j<listInts.size();j++){
				if (listInts.get(k)+listInts.get(j)==expectedSum){
					System.debug(listInts.get(k) + ' + ' + listInts.get(j));
				}
			}
		}	

The above time complexity for a nested for loop is O(n^2)

Solution 2:

Do a binary search to search if the diff is contained in the list


		Integer expectedSum = 8;
		List<Integer> listInts = new List<Integer>{1, 3, 4, 4, 5, 9};
		for (Integer listInt : listInts){ 
			Integer diffInt = expectedSum-listInt;
			if (binarySearch(listInts, diffInt)!=null){
				System.debug(listInt + ' + ' + diffInt);
			}
		}
	
	
	public static Integer binarySearch(List<Integer> listInts, Integer searchInt){
		Integer startPos = 0;
		Integer listSize = listInts.size() - 1;
		Integer mid;
		while(startPos <= listSize){
			mid=(startPos+listSize)/2;
			if(listInts.get(mid) == searchInt){
				 return listInts.get(mid);
			}     
			else if(searchInt < listInts.get(mid)){
				listSize = mid - 1;
			}    
			else{
				startPos = mid + 1;
			}
		}
		return null;	 
	}

The time complexity here is O(n log n): a binary search for each element in the array. Also, returning null is not good practice; prefer returning a sentinel value or throwing an exception.

Solution 3:

Check if the diff is contained in the list by using the .contains method


		Integer expectedSum = 8;
		List<Integer> listInts = new List<Integer>{1, 3, 4, 4, 5, 9};

		for (Integer listInt : listInts){
			Integer diffInt =expectedSum-listInt;
			if (listInts.contains(diffInt)){
				System.debug(listInt + ' + ' + diffInt);
			}
		}

The loop itself is O(n), but List.contains is also a linear scan, so the overall complexity is still O(n^2).

Solution 4:

Start at either end of the array and move inwards as you find solutions: if the sum of the outer and inner elements is bigger than the expectedSum, move the maxPointer inwards; if it is smaller, move the minPointer inwards.

        Integer expectedSum = 8;
		List<Integer> listInts = new List<Integer>{1, 3, 4, 4, 5, 9};

		Integer maxPointer = listInts.size()-1;
		Integer minPointer = 0;
		for (Integer k=0; k<listInts.size();k++){
			if (expectedSum < listInts.get(maxPointer)){
				maxPointer-=1;
			} else if (minPointer!=maxPointer){
				Integer sumPair =listInts.get(minPointer) + listInts.get(maxPointer);
				if ( sumPair == expectedSum){
					System.debug(listInts.get(minPointer) + ' + ' + listInts.get(maxPointer));
					minPointer +=1;
					maxPointer-=1;
				} else if (sumPair < expectedSum){
					minPointer +=1;
				} else {
					maxPointer-=1;
				}
			}
			
		}

the above time complexity for loop is O(n)

Output:

4 + 4
5 + 3
3 + 5

Bonus:

How to do it with an unsorted list:

Iterate through the list, adding each integer to a set; if the difference between the expected sum and the current integer is already contained in the set, print the pair.

		Integer expectedSum = 8;
		List<Integer> listInts = new List<Integer>{7, 4, 6, 1, 5, 2, 3};

		Set<Integer> intSetWithDiff = new Set<Integer>();
		for (Integer listInt : listInts){
			Integer sumDiff = expectedSum - listInt;
			
			if (intSetWithDiff.contains(sumDiff)){
				System.debug(listInt + ' + ' + sumDiff);
			}
			intSetWithDiff.add(listInt);		
		}

Output:

1 + 7
2 + 6
3 + 5