Everything in logistics moves fast in today’s world. Not only do we move faster and do more each day; the pace of innovation in the tools we use is accelerating as well. Technology is constantly reshaping the job we do today into the job we will be doing tomorrow. The move toward digitization, and the value placed on data, have never advanced at a more frenetic pace in the logistics space. As things continue to accelerate, it’s hard to keep up with all of the technology that is available and pushing the boundaries of what we do every day.
One thing is certain: a handful of specific technologies are driving the current pace of change. When these technologies are understood and properly applied to a business, they can be incredibly powerful. However, there are plenty of examples of them being misunderstood, misapplied to the solution, or just plain incorrectly used. The following are four tech terms commonly used when talking about the digitization of the logistics business, along with my take on their relationship to the industry.
“Big Data” could be one of the most commonly used phrases of the past few years in the logistics industry. It’s true that we have more sources of data today than we did a decade ago, thanks in part to the number of advancing technologies being employed. As a result, the logistics business has experienced a data explosion, and many are struggling with what to do with all of it. While there are certainly places for the term “big data,” it is not universal, and it does not apply to just any data set. There is no precise definition of how much data makes it “big”; the general sense is that a data set qualifies when it contains so much information that it cannot be processed by traditional computing (i.e., on a desktop or laptop computer). A simpler way to put it: if you open the data and feel overwhelmed, unable to manipulate it quickly or draw conclusions from it on your own, then it’s “Big Data.” The key is recognizing that such data calls for a higher level of analysis, and that a data expert is generally best positioned to handle it. The saying “paralysis by analysis” comes to mind with some of these larger data sets, and it is often valuable to have help working through assumptions, gaps, and, inevitably, conclusions.
API stands for Application Programming Interface. API connection development is one of the fastest-growing aspects of the business-to-business interfaces being created in the logistics marketplace today. The concept of businesses connecting with their partners via a systemic, integrated process is not new to the industry; over the past thirty years, Electronic Data Interchange (EDI) has dominated the core integration process. APIs are slowly taking market share from EDI, especially as companies upgrade their software or add vendors capable of API interfacing. The value of APIs to the logistics space lies in the flexible, adaptable connections you can build with partners. Data can be sent or requested on demand, instead of through a rigid process like EDI. While an API, like EDI, requires a specific skill set to develop the integration, it is far more scalable once it has been set up, which makes it worth the development effort. There are also varying depths of API connection, with simpler connections merely mimicking a login to a website and deeper connections interfacing with systems directly. EDI has worked very well over the years, but the rigidity of the connection, the structured timing of communication, and the development burden on both parties are allowing new technology like APIs to gain large-scale adoption.
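To make the “ask for data on demand” idea concrete, here is a minimal sketch of what an API exchange might look like. The endpoint is omitted and the field names and payload shape are invented for illustration; every real carrier or 4PL API defines its own schema, and the round trip below is simulated rather than sent over a live connection.

```python
import json

# Hypothetical example: field names and payload shape are illustrative only.
# Real logistics APIs each publish their own schema.

def build_status_request(shipment_id):
    """Build a JSON body asking a partner's API for one shipment's status."""
    return json.dumps({"shipmentId": shipment_id, "fields": ["status", "eta"]})

def parse_status_response(raw_body):
    """Parse the JSON the partner sends back into a plain dict."""
    return json.loads(raw_body)

# Simulated round trip (no live request is made in this sketch):
request_body = build_status_request("SHIP-1001")
simulated_response = (
    '{"shipmentId": "SHIP-1001", "status": "IN_TRANSIT", "eta": "2024-06-01"}'
)
result = parse_status_response(simulated_response)
print(result["status"])  # IN_TRANSIT
```

The contrast with EDI is in the shape of the exchange: instead of batching fixed-format documents on a schedule, either party can ask a question and get a structured answer back the moment it is needed.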
Machine learning is another technology seeing expanded use in software developed for the logistics space. It is a relatively new technology that has come onto the scene as computers have become more powerful, more affordable, and thereby more scalable. As the cost of processing power has dropped, both hosted and virtual, computers have become able to consume massive amounts of data and return meaningful results. The core tenet of machine learning is that the computer learns how to analyze the data on its own as it is analyzing it. A programmer develops an algorithm, a formula for analyzing data, and applies it to a data set. That data set can be fixed, but it is typically one that is accreting more data all the time. The algorithm looks for patterns that help it constantly adapt to the data while working toward a certain type of conclusion.
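The “adapt as data arrives” loop described above can be sketched in a few lines. This toy example trains a one-parameter linear model (miles driven to transit hours) by gradient descent, nudging itself with every observation; the data points and the underlying mileage-to-hours relationship are invented for illustration.

```python
# Toy sketch of machine learning's core loop: a model that adjusts itself
# with each new data point. The (miles, hours) observations are made up.

data = [(100, 2.0), (250, 5.0), (400, 8.0)]  # (miles driven, transit hours)

weight = 0.0             # the single parameter the algorithm "learns"
learning_rate = 0.00001  # how aggressively each data point moves the model

for _ in range(50):                  # repeated passes, as if data kept streaming in
    for miles, hours in data:
        prediction = weight * miles
        error = prediction - hours   # how wrong the current model is
        weight -= learning_rate * error * miles  # nudge the model toward the data

print(round(weight, 4))  # 0.02 -- the model has learned hours ~= miles / 50
```

No one told the program that these lanes average 50 mph; the pattern emerged from the data, which is the essence of the technique. Production systems use far richer models, but the update-on-every-observation loop is the same.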
Pretty much every person who uses the Internet has used a machine-learning algorithm – the Google search engine. Google’s search engine has for more than a decade used an algorithm that constantly learns the best way to rank results, using previous searches to predict which results will be most valuable to the person searching now. Machine learning is best used when paired with other technologies or software as an enabler. It is also arguably one of the most powerful tools available to developers for getting meaningful, ongoing value out of data, and it will continue to be one of the most widely adopted technologies powering data-driven decision making in logistics.
The technical term for these tools is actually “internet bot,” but the name has been shortened to “bot” as it has gone mainstream. Bots are among the newest technologies around: small software programs developed as unique solutions to web-based data-entry problems. They are written to perform repetitive, automated tasks, and their value is that they perform those tasks at a very high rate.
Bots have received notoriety for their use in the stock market, where automated trading now makes up a massive chunk of daily trading in global markets. In the logistics space, bots are being used to manage the high volume of data-entry tasks that have become the norm for the industry. Much of this relates to the systems and processes many service providers are instituting to provide transparency. A good example is the extra step of data validation that has become standard when working with a fourth-party logistics provider (4PL) managing all or a portion of a shipper’s freight network. These 4PLs typically require quite a bit of information to be entered on their website as a way to provide scale and affordability around the transparency they bring to the shipping process. This is almost always information that is also being entered into the Transportation Management System (TMS) of the business actually arranging the movement of the freight. In many cases today, businesses use people to manually copy data from one system to another to comply with the requirements set forth by the 4PL. For the most part, this is a non-revenue-generating activity, and one that will become unsustainable over time as demand for this level of data transparency accelerates. Enter the bots: programs are being written specifically to “enter” this data into these systems automatically. While this technology is currently fairly expensive, as demand increases and more providers enter this space, the cost of adoption should drop dramatically, making it something many will be able to employ for their businesses’ efficiency. Bots certainly have the potential to reshape the way we share information from one system to another, reducing the overhead associated with entering that information manually.
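The heart of such a data-entry bot is a field mapping: take a record the TMS already holds and re-key it into the shape the 4PL portal expects. The sketch below shows only that mapping step; every field name is hypothetical, and a real bot would go on to drive the portal’s web forms or API rather than stop at a dictionary.

```python
# Hedged sketch of a data-entry bot's core step: re-keying a TMS record
# into a 4PL portal's format. All field names here are hypothetical.

FIELD_MAP = {                # TMS field -> portal field (illustrative)
    "pro_number": "referenceNumber",
    "origin_zip": "pickupZip",
    "dest_zip":   "deliveryZip",
    "weight_lbs": "totalWeight",
}

def to_portal_record(tms_record):
    """Re-key one TMS shipment record for automated portal entry."""
    return {portal: tms_record[tms] for tms, portal in FIELD_MAP.items()}

tms_row = {"pro_number": "123456789", "origin_zip": "08077",
           "dest_zip": "90210", "weight_lbs": 4200}
portal_row = to_portal_record(tms_row)
print(portal_row["referenceNumber"])  # 123456789
```

The point of the sketch is the economics: once the mapping is written, the bot repeats it thousands of times at machine speed, which is exactly the non-revenue-generating copy work people are doing by hand today.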
The breakneck pace of change the logistics industry is seeing today is incredibly hard to keep up with. Not only are new technologies being adopted; new service providers and competitors are entering the marketplace with a solid command of these technologies. The ability to firmly grasp each of these technologies, along with other emerging ones like Artificial Intelligence, Blockchain, and Augmented Reality, is going to be the key to successfully navigating the future marketplace we are moving toward. While many of them will not be required to survive, adopting a few for select aspects of your business could certainly help you gain an advantage and position you well for the future.
Joining NFI in 2012, David is in charge of NFI’s North American brokerage, transportation management, Intermodal, and drayage businesses. David is leading this rapidly expanding division by offering a more robust suite of services to new and existing NFI clients.