The Rhino Inquisitor
https://www.rhino-inquisitor.com
Get your insights on Salesforce Commerce Cloud B2C development!

Getting to know the SFCC 24.4 Release https://www.rhino-inquisitor.com/getting-to-know-the-sfcc-24-4-release/ Mon, 01 Apr 2024 08:03:24 +0000

The post Getting to know the SFCC 24.4 Release appeared first on The Rhino Inquisitor.


It’s that time of the year again! The April 2024 (24.4) release of Salesforce B2C Commerce Cloud is finally here, just in time for the spring season. Let’s take a closer look at all the exciting new features and improvements this release offers.

Are you interested in last month’s release notes? Click here!

Added support for additional HTTP methods for Custom APIs

The newly enabled methods in 24.4 now allow us to create custom endpoints for (theoretically) any use case:

  • POST
  • PUT
  • PATCH
  • DELETE
  • HEAD
  • OPTIONS

This update is highly anticipated by Headless projects, as it offers greater flexibility in the Composable Storefront than ever before!
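To see what that unlocks on the client side, here is a minimal sketch of issuing one of the newly supported methods against a Custom API endpoint. The endpoint path and payload are hypothetical, purely for illustration:

```javascript
// Build the fetch options for a call to a (hypothetical) Custom API endpoint.
function buildRequest(method, body) {
    const options = {
        method,
        headers: { 'Content-Type': 'application/json' }
    };

    // GET, HEAD, OPTIONS, and DELETE requests carry no body.
    if (body && !['GET', 'HEAD', 'OPTIONS', 'DELETE'].includes(method)) {
        options.body = JSON.stringify(body);
    }

    return options;
}

// Example: PATCH a hypothetical "wishlist-notes" custom endpoint.
const req = buildRequest('PATCH', { note: 'Gift wrap, please' });
// fetch('https://{shortCode}.api.commercecloud.salesforce.com/custom/wishlist-notes/v1/...', req);
```

The helper just separates the body-carrying methods from the body-less ones, which is exactly the distinction this release makes relevant.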

Read all about it here.

Rogue Query Timeouts in B2C Commerce

To protect customer instances and associated services from outages, B2C Commerce already restricts rogue queries that produce 200,000 results or more.

In the 24.6 release in June 2024, the allowed offset value will be lowered further, from 200,000 to 10,000. If a rogue query is generated, an error message will notify API users that the offset value needs to be reduced, while Business Manager calls will fail without a message.

To reduce the risk of generating large queries, make your queries more targeted via filters and keep offset values small.
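Paging within the limit can be sketched as follows; the 10,000 cap comes from the announcement above, everything else is illustrative:

```javascript
// Sketch: compute the page offsets for a search, stopping before the platform's
// offset limit so the query is never rejected.
function pageOffsets(totalHits, pageSize, maxOffset) {
    const offsets = [];
    for (let offset = 0; offset < totalHits; offset += pageSize) {
        if (offset > maxOffset) break; // past this point, narrow the query with filters instead
        offsets.push(offset);
    }
    return offsets;
}

// With a 10,000 offset cap, a 25,000-hit result set cannot be fully paged:
const offsets = pageOffsets(25000, 5000, 10000);
// offsets === [0, 5000, 10000] — refine the query rather than paging deeper.
```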

What batch APIs can be used to still retrieve large datasets? When a query exceeds the offset limit, it fails with an error like this:

application failed to invoke [search_protocol.search()] on server, responding with
fault. exception was [com.demandware.core.rpc.server.InvocationException: exception
while invoking operation], cause [org.elasticsearch.ElasticsearchIllegalArgumentException:
Search request offset <value> is greater than offset limit 200000 for tenant '<GroupID>_<Instance>', type 'order']

Manage More Images in Page Designer

A screenshot of the Image Picker in Page Designer showing 4 images available to select.

The Page Designer image and media picker now supports up to 1,000 images per folder. Previously, only 200 images within a folder were accessible in the image picker, even if more existed. The increased image limit improves the user experience for merchandisers and content creators and avoids workarounds, such as creating subfolders for extra images.

Since the introduction of Page Designer, users have been facing a recurring issue of images not being found due to the 200-image limit. This issue has been reported globally on a monthly basis.

The good news is that the limit has been increased to 1,000. This should provide some relief to users while setting up their pages. It is still not unlimited, though, so it remains important to manage folders effectively to avoid hitting the new limit.

Business Manager​

Configure eCDN for Staging in Business Manager

A screenshot of the Staging Business Manager showing the link "Embedded CDN Settings" highlighted.

Business Manager now supports configuring eCDN for staging environments. eCDN settings are specific to each instance (development, staging, and production), and you manage them individually. When you create a proxy zone in production, the zone doesn’t replicate a corresponding proxy zone in your development or staging instance. The new eCDN configuration feature simplifies onboarding new sites for staging instances, making it easier to mimic your production instance. Because the configuration uses the existing CDN-API, you can use either Business Manager or the API to manage your eCDN configurations.

It feels like only yesterday that I published my blog post on how to upload certificates to the staging environment. With this latest release, we can now use the same user interface that we use in production and development. This is a great update that should simplify our lives, especially considering that in the past, we had to rely on support to get this done.

Auto-Correction is Disabled by Default

A screenshot of the Searchable Attributes in the Business Manager showing the new default setting for Autocorrection to be "No".

The default setting for auto-correction for searchable attributes added after the B2C Commerce 24.4 release is now set to No. This change affects searchable attributes added through the Business Manager UI or via import. Existing configurations aren’t affected. Previously, when you added a searchable attribute, the default setting was Yes, which could cause issues in instances when search functions shouldn’t correct values, such as the product SKU, ID, or ISBN. Additionally, the auto-correction dictionary’s size can incrementally increase over time, leading to search noise.

A small change has been made to remedy some confusion that can occur while configuring attributes. The Autocorrection feature is now a manual action that needs to be taken by the user, instead of being set by the system by default.

This is a nice change that will improve the overall user experience.

OCAPI & SCAPI

productSearch gets more data

With B2C Commerce 24.3, Salesforce expanded the Shopper Search API productSearch endpoint to include additional parameters: productPromotions, imageGroups, priceRanges, and variants.

Hooks have been a go-to patch for information like this for a while now, and being able to replace these customisations with platform-native solutions will help keep our projects maintainable.

Hook cleanup time!
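A quick sketch of asking productSearch for the richer data; whether these fields are controlled via an `expand` query parameter (and its exact values) is an assumption here, so verify against the Shopper Search reference:

```javascript
// Sketch: assemble the query parameters for a Shopper Search productSearch call.
function buildProductSearchParams(phrase, expansions) {
    const params = new URLSearchParams({ q: phrase });
    if (expansions && expansions.length) {
        params.set('expand', expansions.join(','));
    }
    return params;
}

const params = buildProductSearchParams('sneakers', ['promotions', 'images', 'prices', 'variations']);
// Append params.toString() to the productSearch endpoint URL.
```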

New API: searchCustomerGroup

Search for customer groups within a given site ID. The query attribute specifies a complex query that can be used to narrow down the search.

With each new release, a new API seems to appear—this time, one for the “management” side of things.
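The "complex query" follows the search DSL used across the platform's search endpoints (text_query, term_query, bool_query). A sketch of a request body; the searchable field names are assumptions, so check them against the API reference:

```javascript
// Sketch: build a customer-group search request body using the common query DSL.
function buildCustomerGroupSearch(phrase) {
    return {
        query: {
            text_query: {
                fields: ['id', 'description'], // assumed searchable fields
                search_phrase: phrase
            }
        },
        limit: 20
    };
}

const body = buildCustomerGroupSearch('vip');
// POST this body to the customer-groups search endpoint for the given site.
```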

New API: Shopper Custom Objects API

Use the Shopper Custom Objects API to retrieve custom object information. You can specify an object type ID as well as a key attribute for the object.

Another use case where the OCAPI is no longer required, and we can access Custom Objects in our composable projects more easily.
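As a sketch, the resource path mirrors the familiar OCAPI custom_objects shape; the exact SCAPI route is an assumption here, so verify it in the Shopper Custom Objects reference:

```javascript
// Sketch: build the resource path for fetching a custom object by type ID and key.
function customObjectPath(objectType, key) {
    return `/custom_objects/${encodeURIComponent(objectType)}/${encodeURIComponent(key)}`;
}

const path = customObjectPath('StoreLocatorData', 'store-042');
// → "/custom_objects/StoreLocatorData/store-042"
```

Encoding both segments keeps keys with spaces or slashes from breaking the URL.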

Check for Customization with SCAPI

Two new SCAPI response headers are available to check for custom requests and resulting hook execution errors.

Two new headers have made their way into the SCAPI:

  1. sfdc_customization: indicates whether customization has been applied during the request execution. Currently, the only possible value for the header is "HOOK", which indicates that a hook execution was registered.
  2. sfdc_customization_error: if the value is "1", an error occurred within a hook execution.

This is a great addition that will allow us to get more information on the client side of what is going on and take some of the guesswork out of it.
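Checking the two headers on the client can be as simple as this sketch (the response headers object is stubbed with a Map here):

```javascript
// Sketch: inspect the two new SCAPI response headers on the client.
function customizationInfo(headers) {
    const applied = headers.get('sfdc_customization'); // currently only "HOOK"
    return {
        hookExecuted: applied === 'HOOK',
        hookFailed: headers.get('sfdc_customization_error') === '1'
    };
}

const info = customizationInfo(new Map([
    ['sfdc_customization', 'HOOK'],
    ['sfdc_customization_error', '1']
]));
// → { hookExecuted: true, hookFailed: true }
```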

OCAPI JWT Response to Updated Passwords Is Changed

To enhance security and align with the SLAS JWT session handling, we updated how the OCAPI JWT handles password changes. Now, if your customer changes their password, all previously issued active OCAPI JWTs are invalidated. The OCAPI client receives an HTTP 401 response, accompanied by a body message that indicates an invalid access token. Previously, the JWT remained valid until its normal timeout.

"fault": {
    "arguments": {
        "accessToken": "Customer credentials changed after token was issued. Please Login again."
    },
    "type": "InvalidAccessTokenException",
    "message": "The request is unauthorized, the access token is invalid."
}

Security is a serious matter, and automatic deactivation of active sessions is a valuable update that can give peace of mind. We can also use the information from the response to inform customers of what is happening.

Unfortunately, the error message does not have a unique identifier, only an “English” message, making translating or creating a key slightly more challenging.
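One workaround is to branch on the fault type rather than the English message, since the type looks like the most stable part of the response. A sketch:

```javascript
// Sketch: decide whether a response means the JWT was invalidated by a
// password change and the shopper must log in again.
function needsReauthentication(status, body) {
    return status === 401 &&
        Boolean(body && body.fault && body.fault.type === 'InvalidAccessTokenException');
}

const faultBody = {
    fault: {
        arguments: { accessToken: 'Customer credentials changed after token was issued. Please Login again.' },
        type: 'InvalidAccessTokenException',
        message: 'The request is unauthorized, the access token is invalid.'
    }
};
// needsReauthentication(401, faultBody) === true → discard the JWT and prompt a login.
```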

Updated Cartridges & Tools

b2c-tools (v0.21.1)

b2c-tools is a CLI tool and library for data migrations, import/export, scripting, and other tasks with SFCC B2C instances and administrative APIs (SCAPI, ODS, etc.). It is intended to be complementary to other tools such as sfcc-ci for development and CI/CD scenarios.

  • support parent traversal in page designer library
 

plugin_passwordlesslogin (v1.2.2)

Passwordless login is a way to verify a user’s identity without using a password. It offers protection against the most prevalent cyberattacks, such as phishing and brute-force password cracking. Passwordless login systems use authentication methods that are more secure than regular passwords, including magic links, one-time codes, registered devices or tokens, and biometrics.

How to migrate passwords from Magento using Argon2 https://www.rhino-inquisitor.com/migrate-magento-passwords-using-argon2/ Wed, 27 Mar 2024 08:47:32 +0000

The post How to migrate passwords from Magento using Argon2 appeared first on The Rhino Inquisitor.


As a developer, you might encounter situations where you need to migrate data from one platform to another securely. This requires handling sensitive data like passwords with utmost care. In the case of Magento, password hashing is done using the Argon2 algorithm (depending on the Magento version, your mileage may vary), which is known for its security and resistance against brute force attacks.

Now, if you’re migrating from Magento to Salesforce B2C Commerce Cloud, you need to make sure that the passwords are securely migrated as well. The bad news is that Salesforce B2C Commerce Cloud does not support the Argon2 algorithm out of the box for importing.

The good news is that I managed to migrate a Python script to Node.js that verifies the Magento password hash using the Argon2 algorithm. 

TL;DR: the script

I have created a script that can be adapted for various purposes. However, I would strongly suggest using it only for educational purposes or with proper authorisation. You can change this script to meet your specific needs; for example, you can use it to develop a microservice for migration purposes.

hashes.txt

ab5ebf8d273b085b6a60336198e0a5a2090fdc3e0606a678315c7274ab06e046:5PiKJRn28bBKoFMopMaaKuV47aJ6GzVg:3_32_2_67108864

wordlist.txt

Password1
Password2
Password@
Password3

Script
					const argon2 = require('argon2');
const fs = require('fs');
const readline = require('readline');
const path = require('path');

const hashFilePath = path.join(__dirname, 'hashes.txt');
const wordlistFilePath = path.join(__dirname, 'wordlist.txt');

/**
 * Verifies the given hash string with the given password.
 *
 * @param {string} hashString - The hash string to verify.
 * @param {string} password - The password to verify the hash with.
 *
 * @returns {Promise<void>}
 */
async function verifyHash(hashString, password) {
    const split = hashString.trim().split(":");
    if (split.length !== 3) {
        console.log(`Invalid hash format: ${hashString}`);
        return;
    }

    let [hash, salt_b64, version] = split;

    if (version === "2" || version === "3") {
        version += "_32_2_67108864";
        hashString = `${hash}:${salt_b64}:${version}`;
    }

    const salt = Buffer.from(salt_b64.substring(0, 16));
    const versionInfo = version.split("_");

    if (versionInfo.length !== 4) {
        console.log(`Invalid version format: ${hashString}`);

        return;
    }

    const hashLength = parseInt(versionInfo[1]);
    const hashTimeCost = parseInt(versionInfo[2]);
    const hashMemory = parseInt(versionInfo[3]) / 1024;

    password = Buffer.from(password.trim());

    const result = await argon2.hash(password, {
        salt,
        type: argon2.argon2id,
        memoryCost: hashMemory,
        timeCost: hashTimeCost,
        parallelism: 1,
        hashLength: hashLength,
        raw: true
    });

    const hexHash = result.toString('hex');

    if (hexHash === hash) {
        console.log(`${hashString.trim()}:${password.toString()}`);
    }
}

/**
 * Processes each line of the file at the given file path.
 *
 * @param {string} filePath - The path to the file to process.
 * @param {Function} processLine - The function to process each line of the file.
 *
 * @returns {Promise<void>}
 */
async function processFile(filePath, processLine) {
    const fileStream = fs.createReadStream(filePath);

    const rl = readline.createInterface({
        input: fileStream,
        crlfDelay: Infinity
    });

    for await (const line of rl) {
        await processLine(line);
    }
}

/**
 * Processes each hash string in the hash file and verifies it with each password in the wordlist file.
 */
processFile(hashFilePath, async (hashString) => {
    await processFile(wordlistFilePath, async (password) => {
        await verifyHash(hashString, password);
    });
}).then(() => {
    console.log("Done");
});

Breaking It Down

Dependencies

The code relies on the argon2 library to hash passwords securely using the Argon2 algorithm, which should not come as a surprise. The downside is that this module is written in C, which makes it impossible to use in the back-end of Commerce Cloud.

Porting the algorithm to the platform is probably possible, but it would require a significant amount of effort.

It also uses the fs module for reading files, readline for processing lines, and path for handling file paths.

VerifyHash

This function takes a stored hash (hashString) and a candidate password (password).

It performs the following steps:

    • Parses the hash string into its components (hash, salt, and version).
    • Adjusts the version if needed.
    • Extracts the salt.
    • Retrieves information about hash length, time cost, and memory cost.
    • Computes a new hash using Argon2 with the same parameters.
    • Compares the computed hash with the stored hash.
    • If they match, it logs the hash and the original password.

Version Information

Luckily, Magento stores the required parameters for Argon2 in its version number, which can be extracted for our purposes:

hash:salt:3_32_2_67108864

Conclusion

Although Commerce Cloud does not support Argon2 by default, this workaround is available: you can create a microservice in Node.js to enable frictionless migration for customers.

During login, you can call the service to verify the entered credentials and update the Commerce Cloud password with the available plain text password.

After the migration, you should have an attribute to turn off this service call for that account to avoid unnecessary calls.

In the ring: OCAPI versus SCAPI https://www.rhino-inquisitor.com/in-the-ring-ocapi-versus-scapi/ Mon, 18 Mar 2024 08:49:51 +0000

The post In the ring: OCAPI versus SCAPI appeared first on The Rhino Inquisitor.


As we move into 2024, the SCAPI has received much attention and has been updated with new APIs, updates, and performance improvements. On the other hand, the OCAPI rarely gets any new features in its release notes, leading some to believe it is outdated or deprecated.

In this article, I will explore this topic in detail to determine whether or not these claims are accurate. 

So, let’s get rumbling!

OCAPI versus SCAPI

Salesforce B2C Commerce Cloud has a long-standing history with its OCAPI, which offers a broad range of APIs for various purposes. One typical integration that highlights the functionality of these APIs is Newstore. This mobile application solution uses customisation hooks in the provided cartridge to integrate with the APIs.

The SCAPI, or Salesforce Commerce API, is a relatively “new” set of APIs introduced on July 22, 2020. It offers a different way of interacting with SFCC (Salesforce B2C Commerce Cloud) from third-party systems and headless front-ends than the way we had been doing with the OCAPI (Open Commerce API) before.

However, there is one drawback to the SCAPI: not all APIs that exist in the OCAPI are available in the SCAPI, at least not yet.

Let’s keep score, shall we?

OCAPI: 1
SCAPI: 0

New APIs

In recent years, the SCAPI has introduced several new APIs that the OCAPI does not have. These new APIs have been implemented to address OCAPI gaps or expose new functionality, such as those related to SEO and CDN, allowing for more robust and comprehensive functionality.

SCAPI now offers a wide range of APIs for developers to use, allowing them to build customised solutions for their clients. As these new APIs have been developed explicitly for SCAPI, it is unlikely that the OCAPI will ever have access to them.

In the future, it is clear that any significant new APIs will only be added to the SCAPI, which aligns with the platform’s strategy.

OCAPI: 1
SCAPI: 1

SLAS

SLAS, or Shopper Login and API Access Service, is a Salesforce Commerce Cloud (SFCC) feature allowing third-party systems or headless front-ends to authenticate shoppers and make API calls.

It’s an authentication orchestration service that can handle various scenarios without requiring the creation of custom code for each one separately. (Some tweaking of parameters and configuration is still required, but that’s not the focus of this article.):

  • B2C Authentication: Normal login with Salesforce B2C Commerce Cloud
  • Social Login (Third-party login): Login with platforms such as Google and Facebook
  • Passwordless Login: Login via e-mail or SMS
  • Trusted Agent: Have a third-party person or system log in on behalf of a customer

Although it is possible to use this service in conjunction with OCAPI, it is more part of the SCAPI offering, so let us give a point to SCAPI in this case.

OCAPI: 1
SCAPI: 2

PWA Kit

Have you heard about the PWA Kit or Composable Storefront? You may have, as it’s the latest addition to the front-end options besides SiteGenesis and SFRA.

The Composable Storefront is a Headless storefront that connects to the back-end SFCC systems through the SCAPI. Although it used to be connected to the OCAPI due to some limitations with the hooks system, the latest version is now fully connected to the SCAPI.

It’s no secret that the Composable Storefront is the primary driver for these innovations.

Another point to SCAPI!

OCAPI: 1
SCAPI: 3

Oh my … things aren’t looking good for the OCAPI.

Infrastructure

The architectural setups of the OCAPI and SCAPI options are entirely different. 

The OCAPI runs on the back end, the exact location as the Business Manager, SFRA/SG storefront, and your custom code. 

On the other hand, the SCAPI originally ran on a MuleSoft instance managed by Salesforce (no, you can’t access this – but I know you want to). In the current architecture, Cloudflare Workers have taken over the role that was previously played by MuleSoft.

Although the SCAPI has an extra layer in between, it gives Salesforce the flexibility to make their architecture more composable: one point of entry, while being able to upgrade, fix, or replace parts without anyone noticing. However, this setup has some downsides, such as more network hops between the systems, resulting in network delays that need to be considered. By replacing MuleSoft with Cloudflare, the added network delay should be minimal!

The OCAPI wins for its simplicity, but the SCAPI wins for its future-proof architecture. Nevertheless, this future-proof architecture can only work if it has been set up correctly, and we don’t have any view into that black box.

So, for me, both of them get a point here!

OCAPI: 2
SCAPI: 4

Rate Limits

APIs can be enjoyable to work with, but they are also vulnerable to DDoS attacks and poor design, leading to excessive API calls and a heavy server load. The OCAPI is protected by Cloudflare and Salesforce-managed firewalls, which ensure server safety by limiting the number of requests.

Although the rate-limiter is a straightforward “pass” or “block” method, it is essential to consider its impact and be prepared for the worst.

The SCAPI has implemented a new “Load Shedding” system to replace rate limits. This system provides a comprehensive view of what is happening behind the scenes.

OCAPI: 2
SCAPI: 5

Conclusion

The SCAPI outperforms the OCAPI in multiple ways, which is why the former was implemented. However, if you are still extensively using the OCAPI, there is no need to worry because you are not alone – even the SCAPI uses it behind the scenes. 

Many SCAPI API calls are just a proxy for OCAPI calls. Consequently, as long as the SCAPI depends on the OCAPI, it is not going anywhere.

Reflecting on Two Years of Blogging: My Journey in the SFCC Ecosystem https://www.rhino-inquisitor.com/reflecting-on-2-years-of-blogging/ Mon, 11 Mar 2024 20:17:04 +0000

The post Reflecting on Two Years of Blogging: My Journey in the SFCC Ecosystem appeared first on The Rhino Inquisitor.


I am delighted to announce that I have completed two years of my journey with the Rhino Inquisitor blog. This journey has been a blend of difficulties and achievements.

I am fortunate to have had the opportunity to impart my knowledge and understanding to the constantly expanding group of developers, architects, and enthusiasts who are a part of the Salesforce B2C Commerce Cloud ecosystem. 

Hmm… that sounds like I’m quitting. Not to worry; it’s quite the opposite! I will continue on this journey, but let’s move on, shall we?

How it all started

When I started the blog, I primarily focused on providing technical articles about development, architectural diagrams, and community-related topics. I wanted to help developers and architects better understand Salesforce B2C Commerce Cloud and its capabilities. Over the past two years, I have written many articles, released every week, covering a wide range of topics. I have seen my readership grow as more people discover the value (at least, that’s what I hope) of the information provided on the blog. 

I have received numerous comments and feedback from readers who have found my articles helpful in their work.

Revising is necessary

A hand holding a red pen, marking words in red that need correction.

However, reflecting on the past two years, I realise it is about more than just the numbers. What matters most is my impact on the community and the knowledge I have provided to others. I have seen people take the knowledge and insights they have gained from my blog and apply them to create solutions and drive their projects forward.

As technology evolves, I recognise the importance of keeping my posts current. I am committed to revisiting my historical posts, updating them to match the current state of the platform and re-testing scenarios. 

I want to ensure everyone can access accurate and relevant information to help them achieve their goals.

The coming year

For the next year, I will continue to write articles, but at a slower pace, based on what I see happening in the ecosystem. My main focus will be to revise and update at least one article a week, ensuring that all the articles I have written provide accurate information to everyone who uses my blog.

I have also been given another opportunity to contribute to the community, which I cannot disclose much about. This new project will require a significant amount of time and effort, and if I continue at the pace I am currently maintaining with my blog, I will not be able to do justice to either of them. 

However, this new endeavour will be worth the wait.

Renewed as an MVP

I am delighted to share that I have been selected as a Salesforce MVP for another year. Words cannot express how grateful I am for this recognition. The past few years have been a remarkable journey, and being acknowledged as a Salesforce MVP means that I am on the right path and should continue working with the same dedication and commitment.

I want to thank everyone at Forward, Salesforce, and the community for their support and guidance. I could not have achieved this success without their constant encouragement and motivation. Thank you all for helping me reach where I am today.

The list of names to thank would be too long, but you know who you are! THANK YOU!

Other things

It’s been a while since I’ve indulged in one of my favourite hobbies, but recently, I’ve rediscovered my love for tinkering. 

Before the pandemic hit in 2020, I used to spend my evenings attending cooking and electrician classes and experimenting with Arduino and Raspberry Pis to create home automation systems. I had to put this passion on hold for a while, but I’m excited to dive back into it and see what new creations I can come up with.

There’s something genuinely satisfying about taking a hands-on approach to learning and creating, and I’m looking forward to seeing where this rediscovered hobby takes me.


Digging into the B2C Commerce Cloud 24.3 release https://www.rhino-inquisitor.com/digging-into-the-b2c-commerce-cloud-24-3-release/ Mon, 04 Mar 2024 09:04:46 +0000

The post Digging into the B2C Commerce Cloud 24.3 release appeared first on The Rhino Inquisitor.


As snowy days slowly turn into sunny ones, the 24.3 release of SFCC has arrived! Let us have a look at the March release of 2024.

Are you interested in last month’s release notes? Click here!

Platform

Add More Product Line Items per Basket

A screenshot of the Business Manager basket preferences is displayed, featuring a new 200 default option for product line items.

If your site was limited to 50 line items per basket, the maximum number is increased to 200. This new limit doesn’t affect users who have been granted a lower or higher limit.

Quite a big uplift in the number of different product line items allowed in a single basket by default. This will be particularly handy in certain industries such as groceries, gifting, and even some small B2B cases.

Prioritize Resource Bundle Lookup

You can now change the order of the resource bundle lookup and give priority to the WebDAV resource bundle. The default lookup first checks the resource bundle IDs of the code cartridges assigned to your site and then checks WebDAV. If you have resource bundles with the same ID in the cartridge and WebDAV, the cartridge resource bundle is always selected over the resource bundle in the WebDAV location. You can now use a toggle to switch the order to check WebDAV first.

A new feature toggle is now available under “Feature Switches” in the “Global” section of the Administration panel.

This option will give a bit more flexibility in translation management and open new routes. Does anyone care to revisit “Resource Manager”?

Scheduled Backups Button Is Disabled

The Scheduled Backups button is no longer available. Instead, use a custom job to schedule backups of your production and development environments.

The scheduled backup feature has been turned off on all PIG environments except for the staging environment. This feature was initially intended for the staging environment. So, most projects won’t be affected by this change.

However, if you use this feature in other environments, you can use a system job step called SiteExport that you can schedule independently.

Business Manager

Display Alert Messages in Business Manager

A screenshot of the Business Manager showing all different options on where to show certain notifications: Banner, Header, or Homepage

Display alerts as a persistent banner on the top of every Business Manager page. Alerts can relate to Business Manager modules and are only visible to users with permissions to access the module. Salesforce might also use the enhanced alerting framework to display critical system messages to Business Manager users.

A new option that is more prominent and cannot be ignored. To enable, go to Administration | Operations | Notification Settings in Business Manager and select the Banner alert type.

OCAPI & SCAPI

Prepare for Changes to Server-Side Web-Tier Caching

If you provision your SCAPI zone with a short code, SCAPI caching is enabled by default after March 12, 2024, and the feature switch SCAPI Server-Side Web-Tier Caching has no effect. If you enrolled in SCAPI before March 12, 2024, you can continue to enable SCAPI caching in Business Manager: select Administration | Feature Switches, and turn on SCAPI Server-Side Web-Tier Caching.

Performance is always a hot topic for any industry, and having Web-Tier cache active will hopefully have a significant impact on the performance of all headless channels.

SCAPI - Shopper SEO

  • Updated getUrlMapping's response to include the optional property resourceSubType, which indicates whether the resolved object is a Page Designer content asset or a Content Slot asset. For more information, see the UrlMapping type reference.
  • Updated getUrlMapping to support URL redirects. For more information, see the URL Resolution guide.
  • Updated getUrlMapping to support these hooks: dw.shop.seo.url_mapping.beforeGET and dw.shop.seo.url_mapping.modifyGETResponse.

Some updates to the URL mapping endpoint, which include support for URL Redirection from the Business Manager!

Shopper Baskets v2

  • Provides support for temporary baskets. Temporary baskets can perform calculations to generate totals, line items, promotions, and item availability without affecting the shopper’s storefront cart. You can use these calculations for temporary basket checkout.
  • New Shopper Basket v2 response fields:
    1. groupedTaxItems
    2. taxRoundedAtGroup
    3. temporaryBasket

The new version for Shopper Baskets looks a bit different from v1, so adjust your customisations if you plan to upgrade.

SCAPI Load Shedding

  • If the system reaches a load threshold, an HTTP 503 response is returned for a subset of API families.
  • Covers APIs not covered by rate limits that are considered non-critical, for example: endpoints related to search, products, and authentication. Load shedding is not used for checkout-related endpoints, such as Shopper Baskets and Shopper Orders, to ensure that shoppers can complete an in-progress checkout.
  • Includes additional HTTP response headers that allow you to understand the current system load: sfdc_load, which represents a load percentage with higher percentages indicating higher loads, and sfdc_load_status, which is an enum WARN|THROTTLE that helps you understand the relative health of the system.

We received a notification some time ago regarding the removal of rate limits for SCAPI endpoints.

Instead, a new system called Load Shedding has been introduced. This system allows us to monitor the performance of the APIs based on the new response headers that have been added. If necessary, we can also introduce our own safety features.

While this system gives us more control, it also introduces a new scenario to take into account.
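As a sketch of what such a safety feature might look like (the header names come from the release notes; the thresholds and return values are purely illustrative assumptions):

```javascript
// Sketch of a client-side reaction to SCAPI load-shedding signals.
// sfdc_load and sfdc_load_status are real headers per the release notes;
// the 80% threshold and the action names are assumptions for illustration.
function interpretLoadHeaders(headers) {
  const load = parseFloat(headers.sfdc_load);  // load percentage (higher = more loaded)
  const status = headers.sfdc_load_status;     // 'WARN' or 'THROTTLE'

  if (status === 'THROTTLE') return 'back-off';         // expect 503s on non-critical APIs
  if (status === 'WARN' || load >= 80) return 'reduce'; // proactively slow down non-critical calls
  return 'ok';
}

console.log(interpretLoadHeaders({ sfdc_load: '95', sfdc_load_status: 'THROTTLE' })); // 'back-off'
console.log(interpretLoadHeaders({ sfdc_load: '55', sfdc_load_status: 'WARN' }));     // 'reduce'
```

Checkout-related endpoints are exempt from load shedding, so such a guard only needs to wrap search, product, and similar non-critical calls.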

Custom Request Headers

Developers can send custom request headers that are passed and made available in server-side custom implementations.

It is now possible to add custom headers to your requests to use in your customisations on the server-side.

Pattern: c_{yourHeader}
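A minimal sketch of what that could look like on the calling side, assuming a hypothetical custom header named c_loyalty-tier:

```javascript
// Hypothetical example: sending a custom header to a SCAPI endpoint.
// The header name must match the pattern c_{yourHeader}; 'c_loyalty-tier'
// and its value are made up for illustration.
const customHeaderName = 'c_loyalty-tier';

const requestHeaders = {
  Authorization: 'Bearer <access-token>',
  [customHeaderName]: 'gold',
};

// Only headers following the c_ prefix convention are made available
// to server-side custom implementations.
console.log(/^c_/.test(customHeaderName)); // true
```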

Account Manager

Security Update for the Audit History Logs

Starting with the 1.32.2 release, the full name and full email address for active audit history users are masked from administrators. Masking occurs when the active user isn’t part of the organization and they’re active in Audit History for a User, Organization, or API Client sessions. Masking is also used when organization users view their audit history on the start page. The mask improves security compliance.

A minor update for security compliance.

Updated Cartridges & Tools

plugin_passwordlesslogin (v1.2.1)

Passwordless login is a way to verify a user’s identity without using a password. It offers protection against the most prevalent cyberattacks, such as phishing and brute-force password cracking. Passwordless login systems use authentication methods that are more secure than regular passwords, including magic links, one-time codes, registered devices or tokens, and biometrics.

Update feature so that callbacks, redirects, and scopes are not overwritten.

composable-hybrid-sitegenesis-poc (v2.1.1)

This repository demonstrates a proof of concept (POC) for implementing SLAS and phased rollouts on SiteGenesis. The examples given use the latest version of SiteGenesis, using JavaScript controllers, but the same approach could be used on pipeline versions of SiteGenesis.

Although the version number is already v2.1.1, this repository is new, and this is its first release.

b2c-tools (v0.20.0)

b2c-tools is a CLI tool and library for data migrations, import/export, scripting and other tasks with SFCC B2C instances and administrative APIs (SCAPI, ODS, etc). It is intended to be complementary to other tools such as sfcc-ci for development and CI/CD scenarios.

The post Digging into the B2C Commerce Cloud 24.3 release appeared first on The Rhino Inquisitor.

]]>
The journey from developer to architect https://www.rhino-inquisitor.com/the-journey-from-developer-to-architect/ https://www.rhino-inquisitor.com/the-journey-from-developer-to-architect/#respond Mon, 26 Feb 2024 06:36:00 +0000 https://www.rhino-inquisitor.com/?p=2104 So you want to be an architect, do you? At least, that is why I am guessing you came to this page! Well, you came to the right place! I have gone through this transition myself, and here are some topics I wanted to write about the role. What is an architect? But first things […]

The post The journey from developer to architect appeared first on The Rhino Inquisitor.

]]>

So you want to be an architect, do you? At least, that is why I am guessing you came to this page! Well, you came to the right place! I have gone through this transition myself, and here are some topics I wanted to write about the role.

What is an architect?

The Salesforce Architect Shirtforce shirt

But first things first, let us define what an architect is before we go any further. 

We will not be designing buildings. If that is why you are here, you came to the wrong blog!

Let us first dig into the different types of architects. Do you look surprised? Yes, there are multiple types, each of which has a different role in this world.

Solution Architect

solution architect

If we have to define a Solution Architect in a few sentences, it would be to evaluate all business requirements and develop the best solutions by proposing products and services. Solution Architects are given a business challenge/problem and are tasked with coming up with the answers to these questions.

Solution Architects keep a high-level overview of all the different systems and their capabilities to make the best decisions for a specific business question. After weighing all the pros and cons, they propose various possibilities to answer the question.

If we look at this role within Salesforce, solution architects understand all (or a reasonable amount) of Salesforce’s offerings to provide solutions using them. And, of course, if there is no product within the Salesforce ecosystem, they look at third-party vendors to fill in the gaps.

Are you interested in B2C Solution Architecture? Then be sure to read the Salesforce B2C Solution Architect’s Handbook, authored by Mike King.

Enterprise Architect

A completely different type of architect than the Solution Architect. The main goal of an Enterprise Architect is to validate that the solutions provided to the business are aligned with the organisation’s strategy.

An Enterprise Architect has a top-level overview of the organisation regarding knowledge, capabilities, and potential.

I could write an elaborate article on this, but instead I will refer you to an excellent article on Apex Hours.

Technical Architect

salesforce technical architect

The final one in this list is the Technical Architect, the most specialised one. This type of architect will take a single part or implementation of the big puzzle and focus on that. They will focus on this domain and gain in-depth knowledge that the Enterprise and Solution Architect lacks.

If we look at the Technical Architect certification within Salesforce, these Architects have a deep knowledge of everything related to the Salesforce CRM system. They know the ins and outs of all the different cogs that make that system work – and make them work for them, which is not an easy feat!

To learn more about it, you can go to Trailhead.

Putting them together

If we look at the different architects listed above, it is clear that they work as a team to get “things done.” Each has their own specialisation and stakeholders to get projects across the finish line.

If I look at myself, I am a Technical B2C Commerce Cloud Architect moving toward becoming a Salesforce B2C Solution Architect. I have specialised knowledge of the B2C Commerce Cloud platform and a high-level understanding of different clouds within and outside the Salesforce domain.

Using that knowledge, I look at business cases to develop solutions using Salesforce products (or third-party solutions), mainly connecting them with Salesforce B2C Commerce Cloud.

The journey

But I am getting ahead of myself; how does everything change if you go from being a developer to an architect? Well, a lot will change!

Development

One of the most significant changes going from developer to architect (or any lead position for that matter) is that the amount of time a day you spend writing code will go down dramatically.

And that is something you need to prepare yourself for. You will handle the theoretical more and guide others to implement the practical.

Meetings

A group of people in a meeting room, conversing with a group of people online.

One of the reasons you will be doing less development is because you will be in many more meetings than before. You will be gathering requirements from a client, internal discussions about architecture, follow-up meetings to answer questions you still have and many more reasons to have meetings.

People will also contact you more with questions about decisions that have to be made and help support the team during development (without developing yourself).

Responsibility

Do not take the title ‘architect’ lightly. You will lay the groundwork for the projects in any of the different types. If this is not done correctly, things can go awry rather quickly.

It is the same with civil architecture (and yes, I will compare it with buildings). If the groundwork is shabby, the building will topple over sooner or later.

A large proportion of the success or failure of a project lies with the architect, so be prepared to shoulder it!

Make yourself heard

Get used to voicing your opinion toward your colleagues and your clients. Since you will be shouldering a large proportion of the responsibility, be sure that your views are heard and taken into account.

So, learn to be vocal and get used to talking to all types of people. You will be doing it a lot! And remember: it is OK to make mistakes, but take responsibility for them!

Documentation / Analysis

Instead of writing code, you will write text to support the team doing the implementation and the business.

Get used to writing a lot of documents and drawing diagrams! I decided to use Grammarly to help write English (as it is not my native language).

Don't reinvent the wheel

There is a lot of excellent documentation and help available on the web. Make use of it! If you have been in development for a long time, you might need some refreshers about diagramming and other things.

A good place to start is the Architect Help site of Salesforce!

In the end

The role of an architect is quite diverse if you consider the various “types”. However, it’s crucial to evaluate if you’re willing to make the necessary changes to your daily work routine.

If you have a passion for development and want to continue in that field, it might be best to delay pursuing a career as an architect for a few years. There is nothing wrong with growing within the developer role.

In the end, if every developer becomes an architect, there will be no one left to actually build anything.

The post The journey from developer to architect appeared first on The Rhino Inquisitor.

]]>
https://www.rhino-inquisitor.com/the-journey-from-developer-to-architect/feed/ 0
How to load client-side JavaScript and CSS in SFRA https://www.rhino-inquisitor.com/how-to-load-client-side-javascript-and-css-in-sfra/ Mon, 19 Feb 2024 08:28:48 +0000 https://www.rhino-inquisitor.com/?p=6585 Since you are here, I bet you’ve been banging your head against your keyboard trying to figure out how to load some sweet client-side javascript in Salesforce Commerce Cloud’s SFRA (Storefront Reference Architecture). Well, fear not, because I’m here to help (hopefully)! First, let’s ensure we’re on the same page. SFRA uses ISML (Internet Store […]

The post How to load client-side JavaScript and CSS in SFRA appeared first on The Rhino Inquisitor.

]]>

Since you are here, I bet you’ve been banging your head against your keyboard trying to figure out how to load some sweet client-side javascript in Salesforce Commerce Cloud’s SFRA (Storefront Reference Architecture). Well, fear not, because I’m here to help (hopefully)!

First, let’s ensure we’re on the same page. SFRA uses ISML (Internet Store Markup Language) for its templates and layouts, which means that to load in some javascript, we’ll need to use certain ISML tags and SFRA “features” to include it in our templates.

SFRA provides a helper

The Storefront Reference Architecture provides many features and “helpers” to make developers’ lives easier. One of those features is the “assets.js” file to load client-side JavaScript and CSS in a structured way.

				
<isscript>
    var assets = require('*/cartridge/scripts/assets.js');
    assets.addCss('/css/account/my-file.css');
    assets.addJs('/js/my-file.js');
</isscript>

What is it?

In short: It is a “singleton” type class with two arrays that stores all CSS and JavaScript files that need to be loaded for the current page.

When does it load the files?

CSS

The “htmlHead.isml” template is loaded within the pages of your project. And within that template, you’ll find the following code:

This code is responsible for loading all the fancy styles that are present in that array we talked about earlier.

				
<isloop items="${ require('*/cartridge/scripts/assets.js').styles }" var="style">
    <link rel="stylesheet" href="${style.src}" <isif condition="${style.integrity != null}">integrity="${style.integrity}" crossorigin="anonymous"</isif> />
</isloop>

JavaScript

Like styles, you can also load JavaScript files into your SFRA project using ISML. The main difference is that you’ll be using the scripts.isml template instead of htmlHead.isml. And if you want to see the big picture, you can check out the “page.isml” file, which is the highest-level ISML file used in SFRA.

				
<iscomment>common/scripts.isml</iscomment>
<script defer type="text/javascript" src="${URLUtils.staticURL('/js/main.js')}"></script>
<isloop items="${ require('*/cartridge/scripts/assets.js').scripts }" var="script">
    <isif condition="${script.integrity != null}">
        <script defer type="text/javascript" integrity="${script.integrity}" crossorigin="anonymous" src="${script.src}"></script>
    <iselse>
        <script defer type="text/javascript" src="${script.src}"></script>
    </isif>
</isloop>

It doesn't work! Why????

Remote Includes

If you’re using a remote include to render your component and loading the CSS and JS within that component with “assets.js”, you might have noticed that it doesn’t work. Here’s why:

When you make a remote include, it’s essentially a separate internal request. And here’s the thing about “assets.js” – it works in a singleton way – but only on the request level, meaning that variables are only stored per request and not for all requests. 

So when you add JS and CSS within the remote include, the main request doesn’t know about it because it’s being stored in a separate store. As a result, the added JS and CSS are not rendered on the page. 

Make sense?
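If not, here is a tiny plain-JavaScript simulation (not actual platform code) of the behaviour: each request gets its own assets store, so additions made inside the remote include never reach the main request.

```javascript
// Simulation of assets.js behaving as a per-request singleton.
// Each "request scope" below stands in for one internal request.
function createRequestScope() {
  const assets = { styles: [], scripts: [] };
  return {
    addCss: (path) => assets.styles.push(path),
    addJs: (path) => assets.scripts.push(path),
    assets,
  };
}

const mainRequest = createRequestScope();   // the page render
const remoteInclude = createRequestScope(); // a separate internal request

remoteInclude.addCss('/css/component.css'); // registered in the include's own store...

console.log(mainRequest.assets.styles.length);   // 0 — htmlHead.isml in the main request never sees it
console.log(remoteInclude.assets.styles.length); // 1
```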

Caching

This might seem like a no-brainer, but it’s worth mentioning: remember to clear your cache or disable it entirely if necessary (in your development environment).

Trust me; it can save you a lot of headaches.

A controller without "page.isml"

If you need to render a small component or a unique page type that doesn’t follow the central styling of SFRA, you might need to take matters into your own hands (re-create it or use a different system).

Without the ISML templates we mentioned earlier, there’s no way for the CSS and JS files to be rendered in HTML. Keep that in mind.

As a reference, here is how a controller template in SFRA is usually “decorated”, which includes our “assets.js” templates:

				
<isdecorate template="common/layout/page">
    <!-- The template between the header and footer -->
</isdecorate>

Conclusion

Getting your JavaScript and CSS to appear in the HTML code is not a difficult task. However, a few crucial elements are required to make it happen in a structured way, which SFRA (Storefront Reference Architecture) provides. You should utilise the tools provided to you, as they can make your life just a tiny bit easier.

The post How to load client-side JavaScript and CSS in SFRA appeared first on The Rhino Inquisitor.

]]>
Variation Groups 101: The attribute fallback system in Commerce Cloud https://www.rhino-inquisitor.com/the-attribute-fallback-system-in-sfcc/ Mon, 12 Feb 2024 07:54:09 +0000 https://www.rhino-inquisitor.com/?p=11362 One of the features of B2C Commerce Cloud is the ability to create and use variation products, which share common attributes but differ in one or more aspects, such as colour, size, or style. Variation products can help merchants offer more choices to their customers and optimise their inventory management. Managing variant products can be […]

The post Variation Groups 101: The attribute fallback system in Commerce Cloud appeared first on The Rhino Inquisitor.

]]>

One of the features of B2C Commerce Cloud is the ability to create and use variation products, which share common attributes but differ in one or more aspects, such as colour, size, or style. Variation products can help merchants offer more choices to their customers and optimise their inventory management.

Managing variant products can be a challenging task, especially when it comes to defining and displaying the attributes of each variant. To simplify the process for merchants, integrations, and developers, a system has been implemented that prevents duplication of data at different levels (base, variation group, variant). In this article, we will explore this system and its advantages.

What are variation groups, and how do they differ from slicing?

When it comes to varying on an attribute, there are two options within Salesforce B2C Commerce Cloud. In a previous article, this concept has been explained in detail.

How does the attribute fallback system work for variation products, groups, and base products?

A visual representation of the Variation Group model using a t-shirt: three colours of shirts, each with its own set of sizes, and one base product at the top.
An attempt at visualising the fallback system.

The attribute fallback system is a mechanism that allows B2C Commerce Cloud to automatically retrieve the attribute values of variation products from other sources, such as variation groups or base products when they are not explicitly defined for the variant. This way, merchants can save time and effort in maintaining the attributes of various products and ensure that the customers see the correct and relevant information on the storefront.

The attribute fallback system works as follows:

  • When a customer views a variation product on the storefront, B2C Commerce Cloud first checks if the attribute value is defined for the variation product itself. For example, if the customer views a red shirt in size L, B2C Commerce Cloud first checks if the variant’s name and description are defined.
  • If the attribute value is not defined for the variation product, B2C Commerce Cloud then checks if the attribute value is defined for the variation group to which the variation product belongs. For example, if the red “large” shirt is part of a variation group “red shirt”, B2C Commerce Cloud checks if the name and description of the shirt are defined for the variation group.
  • If the attribute value is not defined for any of the variation groups, B2C Commerce Cloud then checks if the attribute value is defined for the base product that the variation product is derived from. For example, if the red large shirt is a variation of a generic “shirt”, B2C Commerce Cloud checks if the name and description are defined for the base product.
  • If the attribute value is not defined for the base product, B2C Commerce Cloud returns a default or empty value. For example, if none of the sources define the name of the red large shirt, B2C Commerce Cloud returns null for the variant.

Price

A variation group product detail page of a shirt where the variants have different prices, ending up with a "range".
A variation group with different prices for the variants

The attribute fallback system applies to all attributes of variation products except for the price attribute. The price attribute does not have a fallback from variation products to variation groups, as variation groups do not have prices. 

Instead, the price of a variation group is calculated as the range of the prices of the variation products within the group. 

For example, if a variation group for all red handbags contains three variants with prices of $130, $132, and $135, the price of the variation group is displayed as $130-$135 on the storefront.
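That range display can be sketched as follows (an illustration of the behaviour, not the platform's actual implementation):

```javascript
// Sketch: a variation group's displayed price is the range of its variants' prices.
function priceRange(prices) {
  const min = Math.min(...prices);
  const max = Math.max(...prices);
  return min === max ? `$${min}` : `$${min}-$${max}`;
}

console.log(priceRange([130, 132, 135])); // '$130-$135'
```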

What does it mean for development?

Luckily, this system works seamlessly for developers: fetching an attribute automatically triggers the fallback behind the scenes, just like the locale fallback.

  • dw.catalog.Variant class has attribute fallback behavior to first obtain attributes from (one or more) assigned variation groups and then from the base product.
  • dw.catalog.VariationGroup class has attribute fallback behavior to obtain attributes from the base product, when the attribute isn’t specified by the variation group.
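The lookup order those classes implement can be simulated in plain JavaScript (an illustration only; on the platform, the dw.catalog classes handle this for you):

```javascript
// Plain JavaScript simulation of the attribute fallback order:
// variant → variation group → base product. Not platform code.
function resolveAttribute(attr, variant, group, base) {
  for (const product of [variant, group, base]) {
    if (product && product[attr] != null) return product[attr];
  }
  return null; // nothing defined anywhere
}

const base = { name: 'Shirt', description: 'A comfortable shirt' };
const redGroup = { name: 'Red Shirt' }; // overrides the name only
const redLarge = {};                    // the variant defines nothing itself

console.log(resolveAttribute('name', redLarge, redGroup, base));        // 'Red Shirt'
console.log(resolveAttribute('description', redLarge, redGroup, base)); // 'A comfortable shirt'
console.log(resolveAttribute('color', redLarge, redGroup, base));       // null
```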

Advantages for data import

Importing data can be time-consuming and resource-intensive, especially when dealing with large or complex data structures.

The attribute fallback system can reduce the amount of data needed to be imported, as merchants do not have to define the attribute values for every variation product or group.

This way, the merchants can save time and effort in preparing and validating the data and avoid duplicating or conflicting information across different data sources.

Import Speed

Reducing the amount of data in your XML files can lead to a significant decrease in file size. This, in turn, can result in faster imports and improved performance. 

By removing unnecessary duplicate elements, attributes, and content, you can streamline the import XML files, making them easier to process and leaving more room for other processes.

Many advantages

In conclusion, implementing a fallback system has many advantages, especially when it comes to keeping duplicate values away from your database.

The fallback system may also have some disadvantages. However, since it is part of Salesforce Commerce Cloud’s “black box,” we can only hope that any potential drawbacks have been addressed behind the scenes, so that we do not need to be concerned about them.

The post Variation Groups 101: The attribute fallback system in Commerce Cloud appeared first on The Rhino Inquisitor.

]]>
A look at the Salesforce B2C Commerce Cloud 24.2 release https://www.rhino-inquisitor.com/a-look-at-the-salesforce-b2c-commerce-cloud-24-2-release/ Mon, 05 Feb 2024 06:59:49 +0000 https://www.rhino-inquisitor.com/?p=11255 It’s time to gear up for the February 2024 (24.2) release of Salesforce B2C Commerce Cloud! With the arrival of this latest release, let’s look at what’s new and exciting!  You can always check out last month’s release notes by clicking here if you missed it. Platform Partitioned Cookies On By Default This new feature […]

The post A look at the Salesforce B2C Commerce Cloud 24.2 release appeared first on The Rhino Inquisitor.

]]>

It’s time to gear up for the February 2024 (24.2) release of Salesforce B2C Commerce Cloud! With the arrival of this latest release, let’s look at what’s new and exciting! 

You can always check out last month’s release notes by clicking here if you missed it.

Platform

Partitioned Cookies

Concerning browser vendors’ ongoing deprecation of third-party cookies, a new feature in Salesforce B2C Commerce Cloud affects how cookies are handled: “Partitioned Cookies”.

Cookies that are sent by controller code, such as via response.addHttpCookie, will now have the Partitioned flag set.

Most people won’t notice a difference, but this change may have side effects if you have more complex custom cookie logic. However, a feature toggle called Partitioned Cookies defaults to true – you can turn off this new behaviour there, but be sure to read the description carefully, as it is necessary for Page Designer and Toolkit when third-party cookies are disabled in the browser. 
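For illustration (plain JavaScript, not platform code), the Partitioned flag is simply an extra attribute on the Set-Cookie header, next to Secure and SameSite=None, which partitioned cookies require:

```javascript
// Illustration only: what a Set-Cookie header looks like with the Partitioned flag.
// The platform sets this for you; this helper just makes the header visible.
function buildSetCookie(name, value, { partitioned = true } = {}) {
  const parts = [`${name}=${value}`, 'Secure', 'SameSite=None'];
  if (partitioned) parts.push('Partitioned'); // the new default behaviour
  return parts.join('; ');
}

console.log(buildSetCookie('sid', 'abc123'));
// 'sid=abc123; Secure; SameSite=None; Partitioned'
```

If your custom cookie logic parses or compares Set-Cookie headers, this extra attribute is the side effect to watch for.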

Allow Duplicate Terms in Search Phrases

B2C Commerce search now accepts long-tail search phrases with duplicate search terms. When an exact match for a given phrase is found, the suggestion processor returns highly relevant search results.

Previously, duplicate words in long-tail search phrases were autocorrected to slightly different versions. Autocorrecting terms in the search phrase can result in irrelevant search results.

Improvements in the built-in search engine are always welcomed. A reliable search function across all channels enhances user experience and increases conversion rates.

Get a Higher Level of Time Stamp Accuracy with Inventory

ProductInventoryRecord.AllocationResetDate now supports a higher level of accuracy in the database. The change is backward-compatible.

When dealing with various sales channels that impact inventory, multiple stock modifications may occur within the exact same second.

Previously, we were unable to differentiate at the millisecond level, but that is no longer the case! With the database modification that allows for milliseconds in the timestamp instead of seconds, we can now more effectively handle this particular use case.
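A quick illustration of what the extra precision buys: two allocation updates within the same second can now be told apart.

```javascript
// Two allocation timestamps within the same second, distinguishable by milliseconds.
const first = Date.parse('2023-11-22T06:56:01.123Z');
const second = Date.parse('2023-11-22T06:56:01.567Z');

console.log(second > first);  // true — previously both would round to the same second
console.log(second - first);  // 444 (milliseconds apart)
```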

XML Import example

				
// example import with seconds
<allocation-timestamp>2023-11-22T06:56:01Z</allocation-timestamp>

// example import with milliseconds
<allocation-timestamp>2023-11-22T06:56:01.567Z</allocation-timestamp>

OCAPI

				
// Request body example with millis
{ "allocation": { "amount": 17, "reset_date": "2023-11-23T08:39:23.456Z" } }
// Response always with millis
.... "reset_date": "2023-11-23T08:39:23.456Z"

// Request body example with seconds
{ "allocation": { "amount": 17, "reset_date": "2023-11-23T08:41:23Z" } }
// Response always with millis
.... "reset_date": "2023-11-23T08:41:23.000Z"

Business Manager

Specify a Date Format for Locales

You can now modify the date settings format in Business Manager so that the script API method dw.util.Calendar.getFirstDayOfWeek() returns the first day of the week in the date format used by your site locale. Previously, you couldn’t modify the date format to match the local or regional context.

A screenshot of the date settings in a locale before the 24.2 release.
Before
A screenshot of the date settings in a locale after the 24.2 release, showing the new start day of the week option.
After

Development

Custom SCAPI Endpoints

The promised helper functions have arrived with the new custom endpoints going GA, making it easier to create scripts for your endpoints!

OCAPI & SCAPI

Custom SCAPI Endpoints

Although this option has existed since September last year, this feature is now officially out of BETA and can be safely used in production environments.

It is important to note that a few things have changed in this release that might break your current customisations, so verify that all endpoints still behave and work the way you intended.

The documentation has been updated to include these changes. It’s time to read up!

SLAS Updates

  • Improved error handling for TSOB (Trusted System on Behalf) for "customer not found" user scenarios.
  • Support added for using SAP Customer Data Cloud socialize REST endpoints.
  • IDP configuration now allows the IDP client credentials to be added to the POST body. SLAS now supports OIDC client_secret_basic and client_secret_post for client authentication.
  • Updated the /introspect endpoint to include a “sub” claim in the response.
  • Improved validation in the Session Bridge (SESB) flow by checking for the customer_id and failing the request if the customer is already registered.
  • Includes SLAS Admin UI and API bug fix to address the cache synchronization issue when a client is edited or deleted.

SLAS updates this month include some critical changes. One of the issues that has been bothering me for the past year was the visual cache of the SLAS admin UI, which caused a lot of confusion by displaying outdated information. 

However, I’m happy to report that this issue has finally been fixed, dramatically improving UX.

Account Manager

1.32.0 Release

  • Security Fixes
  • Bug Fixes
  • Infrastructure Updates
  • UUID Tokens Switched to JWT Access Tokens: As previously announced in June 2023, Account Manager no longer supports the use of UUID token formats. All new API Clients only support the JWT access token format.

After quite a long warning beforehand, the UUID option is now wholly gone for new API clients!

SFRA v7.0.0

BREAKING CHANGE: SFRA v7.0.0 has been updated to support Node 18

  • Setup Github Actions config by @shethj in #1337
  • Allow arbitrary-length TLDs by @wjhsf in #1352
  • Fix broken locale selector on Page Designer pages. by @wjhsf in #1354
  • Fix search with multiple refinements on PLP by @shethj in #1365
  • Bug: avoid XSS attacks in addressBook.js by @mjuszczyk1 in #1366
  • Update: seo friendly urls for search refinements by @sandragolden in #1331
  • Bug: fix transformations not being applied (W-8851964) by @wjhsf in #1183
  • Use standard ignore for generated files by @wjhsf in #1182
  • Bump version to v7.0.0 by @shethj in #1373
  • Add node18 release note by @shethj in #1374

A long-awaited update to SFRA is finally here, with the promised move to Node 18!

package.json changes in SFRA 7.0.0
Many libraries have been updated!

PWA Kit v3.4.0

General

  • Add support for node 20 #1612
  • Fix bug when running in an iframe #1629
  • Generate SSR source map with environment variable #1571
  • Display selected refinements on PLP, even if the selected refinement has no hits #1622
  • Added option to specify isLoginPage function to the withRegistration component. The default behavior is "all pages ending in /login". #1572

Accessibility
  • Add correct keyboard interaction behavior for variation attribute radio buttons #1587
  • Change radio refinements (for example, filtering by Price) from radio inputs to styled buttons #1605
  • Update search refinements ARIA labels to include "add/remove filter" #1607
  • Improve focus behavior on my account pages, address forms, and promo codes #1625

Storefront Preview
  • We've added a new context input field for Customer Group. This is a text input for now, but we imagine a dropdown in the future.
  • We know many of you will bring third-party CMSs to the mix. We want you to be able to use Storefront Preview with these as well! On that note, please check out our new guidance on Preview extensibility. Essentially, you can forward context changes to a third party to set their version of context in the given platform, meaning your previewed storefront can faithfully render all the content relevant to your context settings.

With smaller and larger updates, the 3.4 release is now equipped to support more use cases and stay current with the latest Node versions!

Bugfixes

Someone decided in January to do some cleanup, which makes it harder to compile an overview, but here is a link to make things easier!

Updated Cartridges & Tools

composable-storefront-pocs

This repo is a composable storefront implementation with various proofs of concept baked in. It otherwise closely tracks pwa-kit.

A big update to the POC library, such as live editing (custom editors) and Promotional/Sale/list pricing on PLP and PDP.

plugin_slas (v7.2.0)

This cartridge extends authentication for guest users and registered shoppers using the Shopper Login and API Access Service (SLAS).

The post A look at the Salesforce B2C Commerce Cloud 24.2 release appeared first on The Rhino Inquisitor.

]]>
Understanding Locale Fallback in Salesforce B2C Commerce Cloud https://www.rhino-inquisitor.com/understanding-locale-fallback-in-sfcc/ Mon, 29 Jan 2024 09:06:32 +0000 https://www.rhino-inquisitor.com/?p=11181 In today’s digitally connected world, personalization and localization play a crucial role in delivering a tailored shopping experience. Salesforce B2C Commerce Cloud understands this and includes a powerful locale fallback mechanism to help businesses cater to various markets while managing content efficiently. In this deep-dive article, we will explore the locale fallback feature, its importance, […]

The post Understanding Locale Fallback in Salesforce B2C Commerce Cloud appeared first on The Rhino Inquisitor.

]]>

In today’s digitally connected world, personalization and localization play a crucial role in delivering a tailored shopping experience. Salesforce B2C Commerce Cloud understands this and includes a powerful locale fallback mechanism to help businesses cater to various markets while managing content efficiently. In this deep-dive article, we will explore the locale fallback feature, its importance, configuration, and potential considerations for developers working on international storefronts.

What is the Locale Fallback?

Locale fallback refers to the system’s ability to serve alternative content when localised data for a request is unavailable. In Salesforce B2C Commerce Cloud, this mechanism ensures that, in multi-locale setups, the application server can source localisable attributes or properties from a predefined sequence of related locales.

The default hierarchy resolves locales by language and country first (e.g., en_US for United States English), then by language alone (en for English), and finally by falling back to a default locale if necessary.

Importance of Locale Fallback

Locale fallback plays a critical role in maintaining a seamless user experience. Imagine a customer browsing an e-commerce website that lacks translation or specific data for their locale. Without fallback, this would lead to incomplete or inconsistent data, negatively impacting user experience and trust. With fallback, the store can still display relevant, albeit generic information, ensuring the site remains functional and informative.

How Locale Fallback Works

Locale Fallback explained with a decision tree going from en_US to en, and finally to default.
Is there a translation present?

Here’s an example to illustrate the concept:

  1. A shopper from the United States with the locale “en_US” visits a product page.
  2. The system first looks for data relevant to the “en_US” locale (product description, pricing, etc.).
  3. If that locale-specific data is unavailable, it falls back to “en” (the English language).
  4. Should the “en” data also be missing, the system retrieves the default locale’s content.

This hierarchy ensures that the user receives readable and relevant content despite gaps in localised information.
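The lookup order above can be sketched in plain JavaScript. This is a simplified model of the behaviour, not the actual platform internals, and the locale codes and data shape are illustrative:

```javascript
// Model of the locale fallback lookup: full locale first (language_COUNTRY),
// then the language alone, then the "default" locale.
function fallbackChain(locale) {
  const chain = [locale];
  if (locale.includes("_")) {
    chain.push(locale.split("_")[0]); // e.g. "en_US" -> "en"
  }
  chain.push("default");
  return chain;
}

// Walk the chain and return the first value that exists for any candidate.
function resolveAttribute(localizedValues, locale) {
  for (const candidate of fallbackChain(locale)) {
    if (localizedValues[candidate] !== undefined) {
      return localizedValues[candidate];
    }
  }
  return undefined; // no value in any locale, nothing to show
}

// A product description that exists only for "en" and the default locale:
const description = {
  en: "A sturdy walking boot.",
  default: "Walking boot",
};

console.log(resolveAttribute(description, "en_US")); // "A sturdy walking boot."
console.log(resolveAttribute(description, "fr_FR")); // "Walking boot"
```

Note how the shopper with fr_FR still sees *something*: the chain simply bottoms out at the default locale instead of returning an empty description.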

Configuring Locale Fallback

A screenshot showing the locale config in 'Administration > Global Preferences > Locales'
Locales and fallback can be configured at "Administration > Global Preferences > Locales"

Salesforce B2C Commerce Cloud allows for customized fallback configurations. You can skip levels in the fallback chain or even eliminate fallback entirely for particular locales, depending on your specific requirements.

For the en_US example, the default fallback chain is en_US > en > default. However, you could configure en_US to bypass the en step and go straight to default, or decide that en_US should not fall back at all.

Screenshot of the locale fallback for en_GB
The possible fallback options for en_GB

Things to Consider

  • Disabling Locale Fallback: You can disable fallback for individual locales. For instance, if the en locale’s fallback is disabled, and there’s no description for a product in the en dataset, then no description will be presented, unlike the usual fallback behavior where default text might be used.

  • Content Types Affected: The locale fallback mechanism applies primarily to subclasses of PersistentObject. This includes objects such as products but does not extend to ISML templates, web forms, resource files in cartridges, or static content such as images.

  • Restrictions: Configuring a locale as a fallback for another locale creates a dependency. Therefore, a locale that serves as a fallback cannot be deleted as long as another locale relies on it. This restriction ensures stability and consistency within your localizable content structure.

Developer Implications

Developers must carefully consider the implications of the fallback system when creating custom modules and localisable attributes. Aspects to keep in mind include:

  • Implementation of Fallback Logic: Developers need to incorporate logic that respects the fallback configurations when developing customisations involving localisable content. In most cases the platform handles fallback transparently, but some use cases require manual workarounds.

  • Testing: Custom fallback configurations require thorough testing across different locales to ensure the expected behaviour and prevent content gaps.
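As a sketch of such a workaround, a hypothetical helper could walk a per-locale chain that mirrors the Business Manager configuration, for content the platform does not fall back on automatically (for example, values fetched from an external CMS). The chains, names, and data below are illustrative assumptions, not a platform API:

```javascript
// Per-locale chains mirroring a (hypothetical) Business Manager setup:
// en_US bypasses "en", en_GB uses the full chain, and "en" has fallback
// disabled entirely.
const FALLBACK_CHAINS = {
  en_US: ["en_US", "default"],
  en_GB: ["en_GB", "en", "default"],
  en: ["en"], // fallback disabled: no value means nothing is shown
};

function lookup(values, locale) {
  // Unknown locales get a generic chain of locale -> default.
  const chain = FALLBACK_CHAINS[locale] || [locale, "default"];
  for (const candidate of chain) {
    if (Object.prototype.hasOwnProperty.call(values, candidate)) {
      return values[candidate];
    }
  }
  return null;
}

const banner = { en: "Free shipping!", default: "Welcome" };

console.log(lookup(banner, "en_US")); // "Welcome" ("en" is bypassed)
console.log(lookup(banner, "en_GB")); // "Free shipping!"
console.log(lookup(banner, "de_DE")); // "Welcome" (generic chain)
```

Keeping the chains in one place like this also makes the testing point above easier: each configured chain can be exercised with a fixture per locale.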

Conclusion

The locale fallback mechanism in Salesforce B2C Commerce Cloud offers a powerful tool for businesses to effectively manage their international content strategy. By understanding and correctly configuring locale fallbacks, developers can ensure that their storefronts are localised and robust, providing a continuous flow of information across different regions and languages. 

As they advance in their Salesforce B2C journey, leveraging this feature will help create an inclusive shopping experience that resonates with a global audience.

The post Understanding Locale Fallback in Salesforce B2C Commerce Cloud appeared first on The Rhino Inquisitor.

]]>