Using the Dieselgate scandal as an example, we show how to fetch stock data in R, perform an Event Study, and create some basic plots with our R package.
library(tidyquant)
library(dplyr)
library(readr)
We use the package tidyquant to fetch the automotive stock data from Yahoo Finance. As we cannot get the total volume size of these companies through the Yahoo Finance API, we do not perform a Volume Event Study in this vignette.
Let’s define the window from which we want to fetch the data of the German auto companies.
startDate <- "2014-05-01"
endDate <- "2015-12-31"
We focus on the big five motor manufacturers in Germany, namely
# Firm Data
firmSymbols <- c("VOW.DE", "AUDVF", "PAH3.DE", "BMW.DE", "MBG.DE")
firmNames <- c("VW preferred", "Audi", "Porsche Automobil Hld", "BMW", "Daimler")

firmSymbols %>%
  tidyquant::tq_get(from = startDate, to = endDate) %>%
  dplyr::mutate(date = format(date, "%d.%m.%Y")) -> firmData

knitr::kable(head(firmData), pad = 0)
As a reference market, we choose the DAX.
# Index Data
indexSymbol <- c("^GDAXI")
indexName <- c("^GDAXI")

indexSymbol %>%
  tidyquant::tq_get(from = startDate, to = endDate) %>%
  dplyr::mutate(date = format(date, "%d.%m.%Y")) -> indexData

knitr::kable(head(indexData), pad = 0)
After we have fetched all the data, we prepare the data files for the API call, as described in the introductory vignette. In this step, we already prepare the volume data for later use.
# Price files for firms and market
firmData %>%
  dplyr::select(symbol, date, adjusted) %>%
  readr::write_delim(file = "02_firmDataPrice.csv",
                     delim = ";",
                     col_names = FALSE)

indexData %>%
  dplyr::select(symbol, date, adjusted) %>%
  readr::write_delim(file = "03_marketDataPrice.csv",
                     delim = ";",
                     col_names = FALSE)

# Volume files for firms and market
firmData %>%
  dplyr::select(symbol, date, volume) %>%
  readr::write_delim(file = "02_firmDataVolume.csv",
                     delim = ";",
                     col_names = FALSE)

indexData %>%
  dplyr::select(symbol, date, volume) %>%
  readr::write_delim(file = "03_marketDataVolume.csv",
                     delim = ";",
                     col_names = FALSE)
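For orientation, each of these files is semicolon-separated without a header and contains one row per symbol and trading day, in the column order selected above. The date below is the first trading day of our window; the value column is shown as a placeholder:

```
VOW.DE;02.05.2014;<adjusted close>
VOW.DE;05.05.2014;<adjusted close>
```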
Finally, we have to prepare the request file; you can find details of its format in the introductory vignette. The parameters for this Event Study are: the event date (18.09.2015, the day the EPA notified Volkswagen of the violation), an event window from day -10 to day +10, and an estimation window of 250 trading days ending 11 days before the event.
<- c(rep("VW Group", 3), rep("Other", 2))
group <- cbind(c(1:5), firmSymbols, rep(indexName, 5), rep("18.09.2015", 5), group, rep(-10, 5), rep(10, 5), rep(-11, 5), rep(250, 5))
request %>%
request as.data.frame() %>%
::write_delim("01_requestFile.csv", delim = ";", col_names = F) readr
After these preparation steps, we are now able to start the calculations. We use the GARCH(1, 1) model in all types of Event Studies. The following lines perform the Event Study using our generated data. By default, our package places the results in the results folder.
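To make the underlying idea concrete before calling the API, the following self-contained sketch runs a deliberately simplified Event Study in base R on simulated returns. It uses an OLS market model as the benchmark rather than the GARCH(1, 1) model the service fits, and all numbers (beta, noise levels, event-day shock) are invented for illustration:

```r
# Illustration only (not the API's implementation): a simplified
# market-model Event Study in base R on simulated returns.
set.seed(1)
n_est <- 250  # estimation window length, as in the request file
n_evt <- 21   # event window from day -10 to day +10

# simulate market and firm returns; the firm loses 15% on the event day
market_ret <- rnorm(n_est + n_evt, mean = 0, sd = 0.01)
firm_ret   <- 0.0002 + 1.1 * market_ret +
              rnorm(n_est + n_evt, mean = 0, sd = 0.005)
firm_ret[n_est + 11] <- firm_ret[n_est + 11] - 0.15  # shock on day 0

# fit the benchmark model on the estimation window only
fit   <- lm(firm_ret[1:n_est] ~ market_ret[1:n_est])
alpha <- unname(coef(fit)[1])
beta  <- unname(coef(fit)[2])

# abnormal return = actual return minus benchmark prediction
evt <- (n_est + 1):(n_est + n_evt)
ar  <- firm_ret[evt] - (alpha + beta * market_ret[evt])
car <- cumsum(ar)  # cumulative abnormal return over the event window
```

The cumulative abnormal return `car` ends near the simulated -15% shock, which is exactly the quantity the plots below visualize for the real Dieselgate data.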
event_study_api_key <- Sys.getenv("event_study_api_key")

library(EventStudy)
est <- EventStudyAPI$new()
est$authentication(apiKey = event_study_api_key)
# get & set parameters for an abnormal return Event Study
# we use a GARCH model and csv as the result file type
# Attention: fitting a GARCH(1, 1) model is compute-intensive
esaParams <- EventStudy::ARCApplicationInput$new()
esaParams$setResultFileType("csv")
esaParams$setBenchmarkModel("garch")
<- c("request_file" = "01_requestFile.csv",
dataFiles "firm_data" = "02_firmDataPrice.csv",
"market_data" = "03_marketDataPrice.csv")
# check data files, you can do it also in our R6 class
checkFiles(dataFiles)
# now let us perform the Event Study
$performEventStudy(estParams = esaParams,
estdataFiles = dataFiles,
downloadFiles = T)
After the analysis, you may continue your work in your toolset or parse the result files.
estParser <- ResultParser$new()
analysis_report <- estParser$get_analysis_report("results/analysis_report.csv")
ar_result <- estParser$get_ar("results/ar_results.csv", analysis_report)
aar_result <- estParser$get_aar("results/aar_results.csv")
Now you can use the downloaded CSV files (or your preferred data format) in your analysis. While creating the arEventStudy object, we merge information from the request and result files.
knitr::kable(head(ar_result$ar_tbl))
The averaged abnormal return (AAR) data.frame has the following shape:
knitr::kable(head(aar_result$aar_tbl))
knitr::kable(head(aar_result$statistics_tbl))
aar_result$plot_test_statistics()
aar_result$plot(ci_statistics = "Patell Z")
aar_result$plot_cumulative()
event_study_api_key <- Sys.getenv("event_study_api_key")

est <- EventStudyAPI$new()
est$authentication(apiKey = event_study_api_key)

# get & set parameters for an abnormal volatility Event Study
esaParams <- EventStudy::AVyCApplicationInput$new()
esaParams$setResultFileType("csv")

est$performEventStudy(estParams     = esaParams,
                      dataFiles     = dataFiles,
                      downloadFiles = TRUE)

estParser <- ResultParser$new()
analysis_report <- estParser$get_analysis_report("results/analysis_report.csv")
ar_result <- estParser$get_ar("results/ar_results.csv", analysis_report)
aar_result <- estParser$get_aar("results/aar_results.csv")

aar_result$plot_cumulative()