Global Modeling with Automated ML: Impact of One Big Beautiful Bill on Big Tech

[This article was first published on DataGeeek, and kindly contributed to R-bloggers.]

Morgan Stanley analysts believe that the One Big Beautiful Bill will be a boon for Big Tech: it provides a cash influx to the AI giants, reinforcing their dominance in the AI race.

But Trump’s 1 August tariffs, intended to offset the bill’s tax cuts, do not appear to have helped the tech firms, as the chart below suggests. Google and META look more resilient than Amazon, likely thanks to their advertising revenues.

Source code:

library(tidymodels)
library(tidyverse)
library(tidyquant)
library(timetk)
library(modeltime.h2o)
library(h2o) #for h2o.init() and h2o.shutdown()

#Amazon
df_amazon <- 
  tq_get("AMZN") %>% 
  select(date, Amazon = close)

#META
df_meta <- 
  tq_get("META") %>% 
  select(date, META = close)

#Google
df_google <- 
  tq_get("GOOGL") %>% 
  select(date, Google = close)

#Merging the datasets
df_merged <- 
  df_amazon %>% 
  left_join(df_meta) %>% 
  left_join(df_google) %>% 
  drop_na() %>% 
  filter(date >= last(date) - months(12)) %>% 
  pivot_longer(-date,
               names_to = "id",
               values_to = "value") %>% 
  mutate(id = as_factor(id)) 
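
Before modeling, it can help to eyeball the three pooled series. This quick check is not part of the original script; it is a sketch using timetk's plot_time_series() on the merged data:

#Optional check: visualize the three series
df_merged %>% 
  group_by(id) %>% 
  plot_time_series(date, value,
                   .facet_ncol  = 2,
                   .interactive = FALSE)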
  
  
#Train/Test Splitting
splits <- 
  df_merged %>% 
  time_series_split(
    assess     = "15 days", 
    cumulative = TRUE
  )
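
To confirm what the 15-day holdout looks like (an optional check, not in the original post), timetk can plot the split plan directly from the splits object:

#Optional check: visualize the train/test split plan
splits %>% 
  tk_time_series_cv_plan() %>% 
  plot_time_series_cv_plan(date, value, .interactive = FALSE)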


#Recipe
recipe_spec <- 
  recipe(value ~ ., data = training(splits)) %>%
  step_timeseries_signature(date) 

train_tbl <- training(splits) %>% bake(prep(recipe_spec), .)
test_tbl  <- testing(splits) %>% bake(prep(recipe_spec), .)
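
step_timeseries_signature() expands the date column into a set of calendar-based predictors (year, month, week, day of week, and so on), which is what the global AutoML model learns from across the pooled tickers. A glimpse() of the baked training data (an optional check, not in the original script) shows the added columns:

#Optional check: inspect the engineered calendar features
glimpse(train_tbl)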


#Initialize H2O
h2o.init(
  nthreads = -1,
  ip       = 'localhost',
  port     = 54321
)



#Model specification and fitting
model_spec <- automl_reg(mode = 'regression') %>%
  set_engine(
    engine                     = 'h2o',
    max_runtime_secs           = 5, 
    max_runtime_secs_per_model = 3,
    max_models                 = 3,
    nfolds                     = 5,
    exclude_algos              = c("DeepLearning"),
    verbosity                  = NULL,
    seed                       = 98765
  ) 


model_fitted <- 
  model_spec %>%
  fit(value ~ ., data = train_tbl)
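
fit() runs H2O AutoML under the hood and keeps the leading model. To see how the candidate models ranked (an optional step, not in the original script), modeltime.h2o provides automl_leaderboard():

#Optional: inspect the H2O AutoML leaderboard
automl_leaderboard(model_fitted)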

#Modeltime Table
model_tbl <- 
  modeltime_table(
  model_fitted
  )




#Calibrate by ID
calib_tbl <- 
  model_tbl %>%
  modeltime_calibrate(
    new_data = test_tbl, 
    id       = "id"
  )

#Measure Test Accuracy

#Global Accuracy
calib_tbl %>% 
  modeltime_accuracy(acc_by_id = FALSE) %>% 
  table_modeltime_accuracy(.interactive = FALSE)

#Local Accuracy
calib_tbl %>% 
  modeltime_accuracy(acc_by_id = TRUE) %>% 
  table_modeltime_accuracy(.interactive = TRUE)



#Prediction Intervals
calib_tbl %>%
  modeltime_forecast(
    new_data    = test_tbl,
    actual_data = df_merged %>% filter(date >= as.Date("2025-07-18")),
    conf_by_id  = TRUE
  ) %>%
  group_by(id) %>%
  plot_modeltime_forecast(
    .facet_ncol  = 2,
    .interactive = FALSE,
    .line_size = 1
  )  +
  labs(title = "Global Modeling with Automated ML", 
       subtitle = "<span style = 'color:dimgrey;'>Predictive Intervals</span> of <span style = 'color:red;'>GBM</span> Model", 
       y = "", x = "") + 
  scale_y_continuous(labels = scales::label_currency()) +
  scale_x_date(labels = scales::label_date("%b %d"),
               date_breaks = "4 days") +
  theme_tq(base_family = "Roboto Slab", base_size = 16) +
  theme(plot.subtitle = ggtext::element_markdown(face = "bold"),
        plot.title = element_text(face = "bold"),
        strip.text = element_text(face = "bold"),
        #axis.text.x = element_text(angle = 60, hjust = 1, vjust = 1),
        legend.position = "none")
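
A possible extension, not shown in the original post, is to refit the calibrated model on the full dataset and forecast beyond the last observed date. The 15-day horizon below mirrors the test window and is an assumption, as is the NA placeholder for the outcome when baking the future frame; shutting down the local H2O cluster at the end simply releases its resources.

#Possible extension: refit on all data and forecast the next 15 days
full_prepared_tbl <- bake(prep(recipe_spec), new_data = df_merged)

future_tbl <- 
  df_merged %>% 
  group_by(id) %>% 
  future_frame(.date_var = date, .length_out = "15 days") %>% 
  ungroup() %>% 
  mutate(value = NA_real_) %>% #placeholder outcome so the recipe can bake
  bake(prep(recipe_spec), new_data = .)

refit_tbl <- 
  calib_tbl %>% 
  modeltime_refit(data = full_prepared_tbl)

refit_tbl %>% 
  modeltime_forecast(
    new_data    = future_tbl,
    actual_data = full_prepared_tbl,
    conf_by_id  = TRUE
  ) %>% 
  group_by(id) %>% 
  plot_modeltime_forecast(.facet_ncol  = 2,
                          .interactive = FALSE)

#Shut down the local H2O cluster when finished
h2o.shutdown(prompt = FALSE)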