Machine learning - giving systems the ability to learn and improve automatically - is a concept that has hugely excited technologists over the last decade. And one aspect in particular, deep learning, has attracted a great deal of attention.
Inspired by the structure of the human brain, deep learning is being used to predict, with high precision, how trends will develop. We look at the problem of time series to explain how deep learning thinking is being applied.
Introducing the time dimension to data
Most machine learning problems involve vast amounts of data, drawn either from your own database or from publicly available datasets. Time series problems likewise involve huge volumes of data. So what distinguishes the two?
The answer is the time dimension.
When dealing with any data, each feature is treated as a separate dimension because of the different ways it is represented when visualised. By including the time dimension, we collect observations in chronological order, which lets us find trends that can tell us a lot about the data.
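To make this concrete, here is a minimal sketch using invented daily prices (not real market data): viewed as an unordered bag of numbers the values look random, but once indexed by date, a simple rolling mean exposes the underlying trend.

```python
import numpy as np
import pandas as pd

# Hypothetical daily closing prices: an upward trend plus noise.
rng = np.random.default_rng(0)
dates = pd.date_range("2017-03-01", periods=365, freq="D")
trend = np.linspace(850, 1450, 365)      # underlying upward drift
noise = rng.normal(0, 20, 365)           # day-to-day fluctuation
prices = pd.Series(trend + noise, index=dates)

# Shuffled, the same values are just a random list of numbers...
shuffled = prices.sample(frac=1, random_state=0).reset_index(drop=True)

# ...but in chronological order, a 30-day rolling mean reveals the trend.
rolling = prices.rolling(window=30).mean().dropna()
print(rolling.iloc[0] < rolling.iloc[-1])  # True: the series trends upward
```

The point is not the specific numbers, which are made up, but that the trend only becomes visible once the time ordering is preserved.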
Take a look at the stock price data of Amazon and Alibaba for the period of March’17 to March’18 to see what we mean.
Although the two stocks traded at nearly the same price at the beginning, a drastic difference can be seen over time. Alibaba's stock grew significantly during July, while Amazon's dropped a little. However, Amazon's rose after November and ended up exceeding Alibaba's by the end of March '18. It's only by seeing the data over time that patterns emerge; otherwise we'd just have a random list of numbers.
As individuals, we work out how to spot such patterns and make sense of them by viewing the data as a time series, but how does a machine manage that? We can either tell the machine exactly what to look for, or we can teach it to work things out for itself.
Equipping a machine with the ability to think
Deep learning uses artificial neural networks, loosely modelled on how the brain works, to process data and extract patterns for use in decision making. A neural network comprises several layers arranged in sequence. Each layer performs a set of operations on its input data and passes the result on to the next layer. After enough training iterations over a large amount of data, we obtain a model that can detect patterns, and those patterns can then be used to predict future outcomes.
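The layered structure described above can be sketched in a few lines of NumPy. This is a toy forward pass, not a trained model: the weights are random placeholders standing in for what training would learn, and the layer sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(x):
    # Simple non-linearity applied between layers
    return np.maximum(0.0, x)

# input -> two hidden layers -> single output (sizes chosen for illustration)
layer_sizes = [4, 8, 8, 1]
weights = [rng.normal(0, 0.5, (m, n)) for m, n in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    # Each layer transforms its input and passes the result onward
    for W, b in zip(weights[:-1], biases[:-1]):
        x = relu(x @ W + b)
    return x @ weights[-1] + biases[-1]  # linear output layer

prediction = forward(np.array([0.1, 0.2, 0.3, 0.4]))
print(prediction.shape)  # (1,)
```

Training would adjust the weights and biases so that the output matches known examples; here only the data flow through the layers is shown.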
To explain the concept of deep learning better, the VGGNet architecture, one of the most popular deep learning models for image classification, makes an ideal example.
The architecture consists of 13 convolutional layers interleaved with five max-pooling layers, followed by three fully connected layers. A softmax layer at the end converts the network's outputs into class probabilities. A colour image with dimensions 224x224x3 is given as the input to the architecture.
At every convolutional layer, certain convolution operations are performed on the image, which results in the detection of progressively more complex features. This process builds a network so deep that it ends up with roughly 138 million trainable parameters.
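That 138 million figure can be checked directly from the architecture's layout. The sketch below counts VGG16's trainable parameters from its published channel widths: thirteen 3x3 convolutional layers, then three fully connected layers operating on the final 7x7x512 feature map.

```python
# Channel widths through the 13 conv layers (input image has 3 channels)
conv_channels = [3, 64, 64, 128, 128, 256, 256, 256,
                 512, 512, 512, 512, 512, 512]
conv_params = sum((3 * 3 * c_in + 1) * c_out     # 3x3 kernel weights + bias
                  for c_in, c_out in zip(conv_channels, conv_channels[1:]))

# Fully connected layers: flattened 7x7x512 features -> 4096 -> 4096 -> 1000
fc_sizes = [7 * 7 * 512, 4096, 4096, 1000]
fc_params = sum((n_in + 1) * n_out               # weights + bias per layer
                for n_in, n_out in zip(fc_sizes, fc_sizes[1:]))

total = conv_params + fc_params
print(total)  # 138357544 -> roughly 138 million
```

Notice that the three fully connected layers account for the large majority of the parameters, even though the convolutional layers do most of the feature detection.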
Deep Learning at COGOPORT
We’re currently focused on logistics and supply chains, dealing with multiple shipments daily, so we get to see a lot of problems in how shipping works. We decided to apply deep learning in a few areas using Recurrent Neural Networks (RNNs) to see if we could find new ways of solving them. RNNs are a modification of standard neural networks: they retain information about past inputs, allowing them to learn from how trends developed over previous periods before making a prediction.
Predicting the number of containers that will be booked with us in the coming weeks helps us ensure we can satisfy customers and alert our suppliers to potential demand. To develop our demand forecasting we used a custom-designed six-layer LSTM (Long Short-Term Memory) architecture, a well-known variant of the RNN, fed with the data of containers booked up to the previous week. It’s a methodology that has given us a reliable idea of how many containers are about to be booked over our trading platform.
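The data preparation behind this kind of forecast can be sketched without the model itself. Below, an invented series of weekly booking counts (the numbers are illustrative, not Cogoport's actual data) is cut into sliding windows: each sample is the previous few weeks and the target is the following week, in the (samples, timesteps, features) shape an LSTM expects.

```python
import numpy as np

# Hypothetical weekly container-booking counts (illustrative values only)
weekly_bookings = np.array([120, 135, 128, 150, 162, 158, 171, 185, 190, 204])

def make_windows(series, lookback=4):
    """Frame a series for supervised learning: past `lookback` steps -> next step."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])   # the past `lookback` weeks
        y.append(series[i + lookback])     # the week to predict
    # LSTMs take input shaped (samples, timesteps, features)
    return np.array(X)[..., np.newaxis], np.array(y)

X, y = make_windows(weekly_bookings)
print(X.shape, y.shape)  # (6, 4, 1) (6,)
```

Each of the six rows of X would be fed through the LSTM layers during training, with the matching entry of y as the value the network learns to predict.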
We have also been using deep learning at COGOPORT to predict how the cost of containers will fluctuate. In fact, even when we don’t know the rates being charged in some areas, we have been able to anticipate accurately what they might be in future.
Our customers have also benefited from our ability to work out which shipping lines might be running a service on a particular route, even if our system only generates a single option for a particular pair of calling points. It means we can offer customers a choice of rates even though only one price is initially found in the database.
The possibilities in the shipping industry are massive, which makes our sector such a good place to try out new approaches. Shipping is complicated in so many ways, and technology puts insights into the hands of small businesses. And it means we’re growing, because more and more shippers are trusting our platform to reduce their costs and simplify their businesses.