
Predicting Cattle Prices

Overview

Beef represents a $100 billion annual market at the US retail level. From cheeseburgers to bone-in rib eyes, beef touches all economic strata and, in many cases, represents a revered aspect of the American diet. However, beef is a luxury, and alternative animal proteins, such as chicken, can be purchased for far less. The ability to predict future cattle prices therefore means the ability to provide guidance to a wide swath of the economy, including supermarkets, restaurants, cattle producers, agricultural equipment manufacturers, and swine and poultry producers. In addition, by understanding future prices, cattle producers can optimize their financial performance and evaluate new niche markets, including organic grass-fed beef.


BG 360° True View is the AI platform that serves as the basis for our systematic approach to capture the multifaceted nature of a commodity’s price and deliver forecasts.

Step 1 - Understand the Process Flow

  • What goes into the target variable (cattle prices)?

    • This includes the price of feed, medication, feedlot costs, etc.

  • What does the target variable go into?

    • Restaurants, supermarkets, etc.

  • In this instance we devised a comprehensive “Dirt to Dish” model
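The two questions above can be sketched as a small directed mapping around the target variable. The factor names below are illustrative placeholders drawn from the examples in this section, not the platform's actual feature set.

```python
# A minimal sketch of the "Dirt to Dish" process flow as two directed mappings:
# what flows INTO the target variable, and what the target variable flows INTO.
UPSTREAM = {
    "cattle_price": ["feed_price", "medication_price", "feedlot_price"],
}
DOWNSTREAM = {
    "cattle_price": ["restaurants", "supermarkets"],
}

def related_factors(target):
    """Return the inputs feeding a target variable and the consumers of it."""
    return {
        "inputs": UPSTREAM.get(target, []),
        "outputs": DOWNSTREAM.get(target, []),
    }
```

In practice each upstream factor would itself have upstream factors, so the full "Dirt to Dish" model forms a deeper graph than this one-level sketch.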

Step 2 - Cast a Wide Net

With an unbiased mindset, we collect data and information from the five all-encompassing domains in a variety of formats.

  • Number-based inputs

    • Stock prices, commodities futures, weather forecasts, …

  • Text-based inputs (NLP and text analysis)

    • 10-Qs, 10-Ks, company conference calls, newspaper articles, trending topics, social media
      posts, …

  • Image-based inputs

    • Satellite imagery, social media inputs, …
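As a toy illustration of the text-based side, the sketch below counts watched terms in a hypothetical filing excerpt. The document text and the vocabulary are invented for the example; real inputs would be full 10-Ks, transcripts, and articles.

```python
import re
from collections import Counter

# Hypothetical excerpt standing in for a 10-K filing or news article.
DOCUMENT = "Drought conditions raised feed costs; feed prices pressured cattle margins."

def term_frequencies(text, vocabulary):
    """Count how often each watched term appears (case-insensitive)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(tokens)
    return {term: counts[term] for term in vocabulary}

features = term_frequencies(DOCUMENT, ["feed", "drought", "cattle"])
# "feed" appears twice in the excerpt above
```

Counts like these are the simplest possible text features; the same shape of output (term → number) is what lets text-based inputs sit alongside number-based inputs later in the pipeline.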

Step 3 - Store the Data

  • Clean and normalize the data (Data Engineering)

    • Quality-control checks are in place to flag missing values

  • Convert various data formats to standardized formats

    • Cadence of the data (hourly, daily, …)

    • Required maximum resolution

  • Sort into optimum storage options
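A minimal sketch of the cleaning step above, assuming pandas and an invented daily price feed containing a duplicate timestamp and a gap:

```python
import pandas as pd

# Hypothetical raw feed: daily prices with one duplicate timestamp and one gap.
raw = pd.DataFrame(
    {"price": [120.0, 121.5, 121.5, None, 124.0]},
    index=pd.to_datetime(
        ["2023-01-01", "2023-01-02", "2023-01-02", "2023-01-04", "2023-01-05"]
    ),
)

def clean_series(df, cadence="D"):
    """Deduplicate, enforce a fixed cadence, flag missing values, and fill gaps."""
    df = df[~df.index.duplicated(keep="first")]  # drop duplicate timestamps
    df = df.resample(cadence).mean()             # standardize the cadence
    df["was_missing"] = df["price"].isna()       # quality-control flag
    df["price"] = df["price"].interpolate()      # simple linear gap fill
    return df

clean = clean_series(raw)
```

Keeping the `was_missing` flag alongside the filled values preserves the quality-control trail: downstream models can weight or exclude interpolated observations.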

Step 4 - Explore the Data

A preliminary analysis is executed to determine whether tangential information should be included, by examining key themes, mainly from our collected text-based data. Assuming additional information is not necessary, the data is then codified.

  • What does the data look like?

  • How is the data distributed?

    • Is it seasonal?

    • Are multi-year cycles present (e.g., El Niño)?

  • Are there any anomalies?
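The seasonality question can be probed with a simple first pass, sketched here on a synthetic series with a built-in annual cycle. Real cattle prices would replace the simulated data.

```python
import numpy as np
import pandas as pd

# Synthetic three-year daily price series with an annual cycle plus noise,
# standing in for real cattle prices.
idx = pd.date_range("2020-01-01", "2022-12-31", freq="D")
rng = np.random.default_rng(0)
seasonal = 10 * np.sin(2 * np.pi * idx.dayofyear / 365.25)
prices = pd.Series(100 + seasonal + rng.normal(0, 1, len(idx)), index=idx)

# First-pass seasonality check: compare the average price by calendar month.
monthly_means = prices.groupby(prices.index.month).mean()
spread = monthly_means.max() - monthly_means.min()
# A month-to-month spread much larger than the day-to-day noise
# hints that a seasonal component is present.
```

A large spread motivates a proper decomposition; multi-year cycles such as El Niño would require comparing across years rather than across months.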

Step 5 - Model

Codification is a critical step as the data must be able to coexist in the same models to fully utilize all explanatory and predictive capabilities.

Following codification, we then run the data through our proprietary machine learning models for predictive and prescriptive analyses.

  • Based upon our understanding of the process flow in Step 1 and best scientific practices, we develop models

  • We have a process to iterate through models to find the best fit from:

    • Simple linear models

    • Multi-GPU TensorFlow

    • Anything in between
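As a stand-in for the proprietary iteration process, the sketch below cross-validates a few candidate regressors, from a simple linear model upward, on synthetic data. scikit-learn is an assumed substitute here, and the multi-GPU TensorFlow end of the spectrum is not shown.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

# Synthetic features and target standing in for the codified inputs.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.5, -2.0, 0.5, 0.0, 3.0]) + rng.normal(0, 0.1, 200)

# Candidate models spanning "simple linear" to more flexible learners.
candidates = {
    "linear": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
    "boosted_trees": GradientBoostingRegressor(random_state=0),
}

# Iterate through the candidates and keep the best cross-validated fit.
scores = {
    name: cross_val_score(model, X, y, cv=5).mean()
    for name, model in candidates.items()
}
best = max(scores, key=scores.get)
```

On this linear synthetic data the simple models win easily; the point of the iteration is that the data, not a prior preference, decides where on the complexity spectrum the final model lands.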

Step 6 - Visualize

  • The greatest model is worthless if it cannot communicate information quickly and accurately

  • We prefer D3 for visualization, with Django on the server side

  • For our clients we produced:

    • Time series visualizations showing commodity prices

    • Maps showing feed yields, cattle distribution by producer type, and various weather conditions/components

    • Probability distributions of target variables
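As one example of the probability-distribution output, the sketch below bins hypothetical forecast samples into a JSON payload that a D3 front end could render. The price distribution is simulated; real samples would come from a model's predictive distribution.

```python
import json

import numpy as np

# Hypothetical forecast samples for the target variable (e.g., cattle price).
rng = np.random.default_rng(42)
samples = rng.normal(loc=175.0, scale=8.0, size=10_000)

# Bin the samples into a normalized histogram payload for the front end.
counts, edges = np.histogram(samples, bins=20, density=True)
payload = [
    {"lo": float(lo), "hi": float(hi), "density": float(d)}
    for lo, hi, d in zip(edges[:-1], edges[1:], counts)
]
payload_json = json.dumps(payload)  # what the Django view would return
```

Serving pre-binned data keeps the browser-side D3 code simple: it only draws rectangles from `lo`, `hi`, and `density`, with no statistics computed client-side.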

The method we utilize generalizes to any product, commodity, or service. By combining sound scientific principles with cutting-edge technology, we ensure biases do not interfere and the most accurate answer is reached.

