First of all, it makes sense to reflect on the goals achieved and the benefits gained so far – particularly in comparison to a paper-based process. These functional benefits are not related to the actual objective of the project, which was to test blockchain. Instead, they can be seen as by-products that were achieved along the way.
For example, there was no avoiding a full digitalisation of the process, which was necessary for the implementation of the pilot app. Even without further optimisation, the digitalised process still works in the same way as the paper-based one. Project participants confirmed this: pallet notes no longer need to be laboriously filled out, and the process can run entirely on digital signatures. That’s a good start!
It also paves the way towards the first functional benefit: the balance between two exchange partners can be accessed at any time, rather than having to wait for the next pile of pallet notes to be entered. This means that decisions concerning pallet movement can be made at the loading bay, based on real-time data. When a company has a higher pallet balance, it can decide to keep hold of empty pallets to gradually even the situation out.
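The balance calculation itself is straightforward bookkeeping over the recorded exchanges. A minimal sketch (company names and the record layout are hypothetical; the pilot stores these transactions on the chain rather than in a Python list):

```python
from dataclasses import dataclass

@dataclass
class Exchange:
    giver: str       # company handing over empty pallets
    receiver: str    # company receiving them
    pallets: int

def balance(transactions, a, b):
    """Net pallet balance of company `a` towards company `b`:
    positive means `b` still owes pallets to `a`."""
    net = 0
    for tx in transactions:
        if tx.giver == a and tx.receiver == b:
            net += tx.pallets
        elif tx.giver == b and tx.receiver == a:
            net -= tx.pallets
    return net

# Two hypothetical partners exchanging pallets in both directions
log = [Exchange("Alpha", "Beta", 30), Exchange("Beta", "Alpha", 12)]
print(balance(log, "Alpha", "Beta"))  # 18
```

Because every node holds the full transaction history, any partner can run this calculation locally at any time instead of waiting for pallet notes to be typed in.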
The data format was also standardised and tested during the pilot project. In practice, this is an area where hurdles and seemingly insurmountable barriers are often met. By using the GS1 EPCIS format (with minimal adjustments for blockchain), the team was able to reach a consensus quickly. Several project participants also looked directly inside the blockchain and saw the data format used – everyone could understand the transactions even without the front end. This may be uncomfortable, but it underlines the openness of our solution and is the cornerstone that enables the data to be interpreted.
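To give a feel for what such a record looks like, here is an illustrative event in the spirit of the EPCIS vocabulary. The field values and the pallet class are made up for this sketch; the exact schema agreed in the pilot is not reproduced here:

```python
import json

# Illustrative only: the field names (eventTime, action, bizStep,
# quantityList, ...) follow the public GS1 EPCIS vocabulary, but the
# values and the pallet class below are hypothetical.
event = {
    "type": "ObjectEvent",
    "eventTime": "2019-03-07T10:15:00Z",
    "eventTimeZoneOffset": "+01:00",
    "action": "OBSERVE",
    "bizStep": "shipping",
    "quantityList": [{"epcClass": "pallet", "quantity": 33}],
}

# A canonical serialisation makes the event byte-for-byte identical on
# every node, which matters once it is written to a shared chain.
payload = json.dumps(event, sort_keys=True, separators=(",", ":"))
print(payload[:40])
```

The point the pilot demonstrated is exactly this readability: anyone inspecting the chain can interpret such a record without the front end.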
In our opinion, the analysis of the data represents the greatest potential of an entirely digital solution like this one. For example, it introduces the concept of circular swaps, which would allow more than two retail partners with differing balances to even out their pallet stock with one another. Currently, consideration of such scenarios is based on full data transparency, which we adopted for the pilot. However, there are alternative approaches that would also enable these scenarios, which we’ll come to later.
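The netting logic behind a circular swap can be sketched in a few lines. This assumes the open balances happen to form a cycle (hypothetical partners A owes B, B owes C, C owes A); the smallest debt in the cycle can then be cancelled from every leg:

```python
def circular_swap(debts):
    """debts maps (debtor, creditor) -> pallets owed along a cycle
    such as A->B->C->A. The smallest debt in the cycle can be
    settled on every leg simultaneously, shrinking all balances."""
    settle = min(debts.values())
    return {pair: owed - settle for pair, owed in debts.items()}

# Hypothetical balances: A owes B 40 pallets, B owes C 25, C owes A 60
debts = {("A", "B"): 40, ("B", "C"): 25, ("C", "A"): 60}
print(circular_swap(debts))  # one leg drops to zero, the others shrink by 25
```

No pallet physically moves in such a settlement; only the recorded balances change, which is precisely why it needs trustworthy, shared data.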
Last but not least, we successfully completed a blockchain project involving 17 partners, the success of which heavily depended on these partners working together closely on equal terms. Blockchain itself actually took a back-seat, just like it has in this post – and that’s a good thing! What matters to the end user is that the process runs smoothly. Blockchain is only a technology – in other words, it’s no more than a means to an end.
Firstly, it should be said that the resulting data volume is more than manageable. We can expect roughly 10 MB per 1,000 exchange transactions (plus some overhead and indexes). The transactions recorded during the pilot project would equate to an annual volume of only around 500 MB of data.
This may not seem like much initially, but when it is scaled up to the number of network participants and a growing number of years, the volume of data created should not be underestimated. An extrapolation of the data from the load test (3,600 exchanges per hour, 86,400 per day and just under 32 million per year) shows that data volumes quickly reach the gigabyte level – roughly 300 GB per year. Given this, combined with the basic principle that the data is immutable, we soon recognise that the location and life cycle of the data need serious consideration.
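The extrapolation above is simple arithmetic and can be checked in a few lines (10 KB per exchange is the pilot figure of ~10 MB per 1,000 transactions quoted earlier):

```python
BYTES_PER_EXCHANGE = 10 * 1024   # pilot figure: ~10 MB per 1,000 exchanges
per_day = 3600 * 24              # load-test rate: 86,400 exchanges per day
per_year = per_day * 365         # just under 32 million exchanges per year
gb_per_year = per_year * BYTES_PER_EXCHANGE / 1024**3
print(per_year)                  # 31536000
print(round(gb_per_year))        # ≈ 300 GB per year
```

At the pilot's actual rate the chain stays small; it is only at load-test throughput, sustained over years and replicated to every node, that storage becomes a real design concern.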
One frequently discussed approach, which is gradually being introduced into a variety of enterprise-centric blockchain technologies, is the partial replication of data between the companies involved in a process. MultiChain 2.0, for example, introduces this as off-chain data: private communication channels between two nodes whose contents do not need to be published in the globally replicated chain.
This approach has benefits in terms of both data volume and latency of transactions because only a fingerprint of the data needs to be available to everyone at all times, with the actual data being replicated asynchronously later on. In practice, another benefit that this brings for our pallet consortium is the fact that expected data volume scales linearly with the number of a company’s own transactions. A multinational logistics company can definitely deal with terabytes’ worth of data, but it makes no sense for a nursery that only handles a few transactions a day or week.
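The fingerprint idea can be sketched in a few lines. The record layout is hypothetical, and a plain SHA-256 hash stands in for whatever commitment scheme the platform actually uses:

```python
import hashlib
import json

def fingerprint(payload: dict) -> str:
    """Deterministic hash that every node stores on the shared chain;
    the payload itself is only replicated between the two partners."""
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical exchange record shared only between the two companies
tx = {"giver": "Alpha", "receiver": "Beta", "pallets": 33}
on_chain = fingerprint(tx)   # 64 hex characters, visible to all nodes
off_chain = tx               # full record, replicated bilaterally
print(on_chain[:16])
```

Everyone can verify later that a disclosed record matches its on-chain fingerprint, while nodes that were never party to the exchange only ever store the small hash.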
An extrapolation among some companies that took part in the test confirms this assumption and makes it evident that implementing the correct data replication concept can significantly increase the acceptance of the technology.
Despite possible optimisations using partial data replication, the lifecycle also needs to be given some thought at some point. Once a blockchain, always a blockchain. Blocks build on top of one another and are validated back to the first – or genesis – block. Old data cannot simply be deleted or archived.
The most promising approach is also the simplest. Like a full notebook or an overflowing order book, the pages eventually run out. A technical solution never literally runs out of pages, of course, but the remedy is the same: simply transfer the closing balance into the opening balance of a new stream, thereby closing off the old one. A copy of the data in the immutable stream can then be archived or stored using traditional tools.
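The roll-over can be illustrated as follows; the streams and record layout here are hypothetical Python stand-ins for the actual chain structures:

```python
def roll_over(old_stream, closing_balances):
    """Close a stream by carrying its closing balances over as the
    opening entry of a fresh stream; the old stream is copied off
    for archival with conventional tools."""
    new_stream = [{"type": "opening-balance", "balances": dict(closing_balances)}]
    archive = list(old_stream)   # immutable history, ready for cold storage
    return new_stream, archive

# Hypothetical old stream with one exchange and its resulting balance
old = [{"type": "exchange", "giver": "Alpha", "receiver": "Beta", "pallets": 30}]
new, archived = roll_over(old, {("Alpha", "Beta"): 30})
print(new[0]["type"])  # opening-balance
```

Day-to-day operation then continues against the new stream only, while the archived copy remains available as evidence.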
The system that was successfully tested during the pilot project is now being extended to digitalise the debt carried forward, which is currently part of the information on the pallet note. The difference in the number of pallets exchanged creates a debt that needs to be settled at some point, either in pallets or in money. However, this functionality doesn’t strictly need a blockchain, because the debt carried forward is only recognised with the signature of both exchange partners – without both signatures, the debt is not acknowledged. Today, blockchain primarily serves to ensure error-free replication for both parties and to act as a mutual source of evidence.
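The two-signature rule can be sketched as follows. HMACs over shared secrets stand in for the real digital signatures, and the record text is made up; the point is only that a carried-forward debt counts once, and only once, both partners have signed it:

```python
import hashlib
import hmac

def sign(secret: bytes, message: bytes) -> str:
    # HMAC-SHA256 stands in for the real digital signatures used in the pilot
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def debt_acknowledged(record, sig_a, sig_b, key_a, key_b):
    """A carried-forward debt is only recognised when *both*
    exchange partners have signed the same record."""
    return (hmac.compare_digest(sig_a, sign(key_a, record))
            and hmac.compare_digest(sig_b, sign(key_b, record)))

record = b"Alpha owes Beta 18 pallets"
ka, kb = b"alpha-secret", b"beta-secret"          # hypothetical keys
print(debt_acknowledged(record, sign(ka, record), sign(kb, record), ka, kb))  # True
print(debt_acknowledged(record, sign(ka, record), "missing", ka, kb))         # False
```

The chain adds nothing to the validity of the signatures themselves; what it contributes is that both parties hold an identical, tamper-evident copy of the signed record.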
Alternatively, a digital cryptographic currency (e.g. pallet coin) could be used, either as an advanced means of carrying debt forward (where coins can be freely generated) or as an actual representation of the monetary value of the pallets (which would require a centralised distribution of coins). Further details about this topic can be found in the blog post about <link innovation blockchain-blog der-gs1-palettencoin>native assets.
As mentioned above with regard to the balance calculation, it is a good idea to take a closer look: Does this situation really need blockchain technology to solve it? Well, yes and no.
Firstly, why not? Because the biggest added value in the process comes from digitalisation. Because trust can also be established in this situation by both parties signing a record. And because even circular swaps would probably work if companies periodically published their own balances.
However, these points disregard one of the key reasons for using an enterprise blockchain – decentralised partners working together with the same information. In most situations, this can only be achieved by designing an alternative process, which has been agreed by all participants. In concrete terms, this means finding an intermediary who can provide a pallet exchange platform where partners, customers and even competitors can publish all their data relating to the pallet exchange process. As explained in the ‘Data visibility’ section, this is perhaps easier said than done.
So, why does this situation call for blockchain? It does so because it allows me to validate my assumed reality without a central authority. It also creates extremely simple methods for tracing transactions (using consortium identities, see below). Furthermore, it provides a suitable framework for defining and enforcing rules together as a consortium.
As far as I am concerned, the real question that needs answering is not whether blockchain-based or blockchain-influenced technology should be adopted, but which one will provide the right framework for a pallet exchange consortium. This question needs to be answered because creating a pallet exchange consortium on its own does not change current practice significantly enough to achieve greater collaboration between partners on equal terms.
Running a node costs money. Since we need to run a certain number of nodes for the blockchain to work properly, we need to have sufficient intrinsic motivation (data transparency, validation of transactions) or else another incentive must be found. This issue is currently being considered with regard to the formation of the consortium because if billing models are not found at this level, technical incentives will be needed.
During the pilot phase, only two admin nodes in the MultiChain network had mining rights (allowing them to add new blocks to the blockchain). Thinking beyond the end of the pilot phase, it will be crucial to define the necessary size of the core network, along with the corresponding mining and admin rights. It also follows that we need to think about the lifecycle of the blockchain (for example, a major version upgrade). Furthermore, admin rights and obligations still need to be defined (for example, how many mining nodes are needed online, etc.). These rights also lead on to establishing consortium-controlled obligations (e.g. node uptime). In a public blockchain, these issues are normally solved using incentives (e.g. block rewards), while in a private set-up, they need to be solved using contractual agreements.
The end-of-project questionnaire revealed that many project participants object to the full data transparency currently practised in the project. This controversial point crops up in most blockchain projects and had therefore already been discussed during the solution design phase. Nevertheless, we initially decided to go with full transparency so that we could demonstrate the potential added value of data visibility.
First of all, it should be noted that the decentralised nature of blockchain technology opens up the opportunity to think about data visibility beyond peer-to-peer integration. In a centrally operated network, by contrast, the intermediary has full visibility of all data. Having a decentralised alternative is therefore a benefit in terms of autonomy and the protection of sensitive data.
One possibility for restricting transparency is the private side-chain/off-chain data mentioned above, which is shared only between the parties involved and anchored to the global transaction chain only when necessary. Depending on the desired and permitted level of transparency, circular swap scenarios would still be possible – all that would be required is a periodic publication of balances. In other words, individual transactions can remain ‘secret’ between the parties involved.
The field of process optimisation is as wide as it is varied. Digitalisation usually paves the way towards changes and optimisations, not all of which make sense, and these must always be weighed up against the tangible benefits that they bring. The pilot project created a technical basis for recording pallet movements. However, this only enables pallet exchanges to be recorded quantitatively, and not qualitatively.
One interesting opportunity is to serialise pallets using an RFID chip or code, which would shift our focus from any pallet to a specific pallet. This would enable us to track the actual movement of individual pallets, which, in turn, would enable optimisations such as a reduction in the number of empty runs, more efficient exchange procedures, etc.
In conclusion, it should be noted that the technology itself – blockchain or, more specifically, MultiChain – did not give rise to any critical discussions or situations during any phase of the project. Instead, it was the controversial discussions about the process design, the data models, procedures and interaction channels (frontend) that sometimes made us wonder whether the project would be a success. It is therefore important to have a holistic, comprehensive view of the process as a whole, rather than just the technology. Once the necessary framework conditions have been made clear, the technology is just peripheral.