This vignette provides a quick tour of the rtweet package.
library("rtweet")
First you should set up your own credentials; this only needs to be done once:
auth_setup_default()
This will open your browser so you can log in with your account, then create a token and save it as the default. From now on, in this R session or in others, you can use this authentication with:
auth_as("default")
rtweet will then automatically use that token in all the API queries it makes during the session.
If you want to set up a bot or collect a lot of information, please read vignette("auth", "rtweet").
You can search tweets:
## search for 100 tweets using the rstats hashtag
rstats <- search_tweets("#rstats", n = 100, include_rts = FALSE)
colnames(rstats)
#> [1] "created_at" "id"
#> [3] "id_str" "text"
#> [5] "full_text" "truncated"
#> [7] "entities" "source"
#> [9] "in_reply_to_status_id" "in_reply_to_status_id_str"
#> [11] "in_reply_to_user_id" "in_reply_to_user_id_str"
#> [13] "in_reply_to_screen_name" "geo"
#> [15] "coordinates" "place"
#> [17] "contributors" "is_quote_status"
#> [19] "retweet_count" "favorite_count"
#> [21] "favorited" "favorited_by"
#> [23] "retweeted" "scopes"
#> [25] "lang" "possibly_sensitive"
#> [27] "display_text_width" "display_text_range"
#> [29] "retweeted_status" "quoted_status"
#> [31] "quoted_status_id" "quoted_status_id_str"
#> [33] "quoted_status_permalink" "quote_count"
#> [35] "timestamp_ms" "reply_count"
#> [37] "filter_level" "metadata"
#> [39] "query" "withheld_scope"
#> [41] "withheld_copyright" "withheld_in_countries"
#> [43] "possibly_sensitive_appealable"
rstats[1:5, c("created_at", "text", "id_str")]
#> Users data at users_data()
#> # A tibble: 5 × 3
#> created_at text id_str
#> <dttm> <chr> <chr>
#> 1 2022-12-20 07:00:00 "Practical Linear Algebra for #DataScience! #BigData #Analytics #Mat… 16050…
#> 2 2022-12-19 03:00:17 "#SQL Interview for #DataScientists! #BigData #Analytics #DataScienc… 16046…
#> 3 2022-12-20 00:26:01 "LSTM: Intro to Long-Short Term Memory. #BigData #Analytics #DataSci… 16049…
#> 4 2022-12-20 22:02:11 "CRAN updates: archiveRetriever #rstats" 16053…
#> 5 2022-12-20 21:37:06 "#Python Libraries and Frameworks via @DataScienceDojo! 😄🙃😊\n\n#1… 16053…
The include_rts = FALSE argument excludes retweets from the search.
Twitter rate-limits the number of calls you can make to its endpoints. See rate_limit() and the rate limit section below. If your query requires more calls, as in the example below, simply set retryonratelimit = TRUE and rtweet will wait out the rate limit resets for you.
## search for 250,000 tweets containing the word peace
tweets_peace <- search_tweets("peace", n = 250000, retryonratelimit = TRUE)
You can also search by geolocation, for example tweets in English sent from the United States.
# search for tweets sent from the US
# lookup_coords requires Google maps API key for maps outside usa, canada and world
geo_tweets <- search_tweets("lang:en", geocode = lookup_coords("usa"), n = 100)
geo_tweets[1:5, c("created_at", "text", "id_str", "lang", "place")]
#> Users data at users_data()
#> # A tibble: 5 × 5
#> created_at text id_str lang place
#> <dttm> <chr> <chr> <chr> <list>
#> 1 2022-12-20 13:19:26 Love being your Christmas Danny. https://t.co/XdR7Pt90at 16051… en <df>
#> 2 2022-12-20 22:05:45 Pedestrian Accident in #Charlotte on University City Bl… 16053… en <df>
#> 3 2022-12-20 22:05:45 Accident. Right Shoulder blocked in #Centreville on I-6… 16053… en <df>
#> 4 2022-12-20 22:05:45 Always having to pee is getting really annoying 😭😭 16053… en <df>
#> 5 NA <NA> <NA> <NA> <NULL>
You can check the location of these tweets with lat_lng(). Or quickly visualize the frequency of tweets over time using ts_plot() (if ggplot2 is installed).
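As a quick sketch (reusing the geo_tweets object from the search above), lat_lng() appends lat and lng columns parsed from each tweet's geo fields:

```r
## append latitude/longitude columns parsed from the tweets' geo data
geo_coords <- lat_lng(geo_tweets)
head(geo_coords[, c("created_at", "lat", "lng")])
```

Tweets without geo information get NA coordinates.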
## plot time series of tweets
ts_plot(rstats) +
theme_minimal() +
theme(plot.title = element_text(face = "bold")) +
labs(
x = NULL, y = NULL,
title = "Frequency of #rstats Twitter statuses from past 9 days",
subtitle = "Twitter status (tweet) counts aggregated using three-hour intervals",
caption = "Source: Data collected from Twitter's REST API via rtweet"
)
You can post tweets with:
post_tweet(paste0("My first tweet with #rtweet #rstats at ", Sys.time()))
#> Your tweet has been posted!
It can include media and alt text:
path_file <- tempfile(fileext = ".png")
png(filename = path_file)
plot(mpg ~ cyl, mtcars, col = gear, pch = gear)
dev.off()
#> png
#> 2
post_tweet("my first tweet with #rtweet with media #rstats", media = path_file, media_alt_text = "Plot of mtcars dataset, showing cyl vs mpg colored by gear. The lower cyl the higher the mpg is.")
#> Your tweet has been posted!
You can also reply to a previous tweet, retweet it, and provide additional information.
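For instance, a minimal sketch using post_tweet()'s in_reply_to_status_id and retweet_id arguments (the status id here is purely illustrative):

```r
## reply to an existing tweet (illustrative status id)
post_tweet("Thanks for sharing! #rstats",
           in_reply_to_status_id = "1605298908707508224")

## retweet the same status; the status text is ignored when retweeting
post_tweet(retweet_id = "1605298908707508224")
```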
Retrieve a list of all the accounts a user follows.
## get user IDs of accounts followed by R Foundation
R_foundation_fds <- get_friends("_R_Foundation")
R_foundation_fds
#> # A tibble: 30 × 2
#> from_id to_id
#> <chr> <chr>
#> 1 _R_Foundation 1448728978370535426
#> 2 _R_Foundation 889777924991307778
#> 3 _R_Foundation 1300656590
#> 4 _R_Foundation 1280779280579022848
#> 5 _R_Foundation 1229418786085888001
#> 6 _R_Foundation 1197874989367779328
#> 7 _R_Foundation 1102763906714554368
#> 8 _R_Foundation 1560929287
#> 9 _R_Foundation 46782674
#> 10 _R_Foundation 16284661
#> # … with 20 more rows
Using get_friends() we retrieved which users the R Foundation follows. If instead you want all the users that follow the account, use get_followers():
R_foundation_flw <- get_followers("_R_Foundation", n = 30000,
                                  retryonratelimit = TRUE)
#> Downloading multiple pages ======================>----------------------------------------------
Note that the retryonratelimit option is intended for when you need more queries than Twitter allows in a given period. You can check with rate_limit() how many calls it allows for the endpoints you are using. If the limit is exceeded, retryonratelimit waits until more calls become available and then resumes the query.
As seen above, we can use lookup_users() to retrieve more information about these users:
# Look up the accounts followed by the R Foundation
R_foundation_fds_data <- lookup_users(R_foundation_fds$to_id, verbose = FALSE)
R_foundation_fds_data[, c("name", "screen_name", "created_at")]
#> Tweets data at tweets_data()
#> # A tibble: 30 × 3
#> name screen_name created_at
#> <chr> <chr> <dttm>
#> 1 R Contributors R_Contributors 2021-10-14 21:15:12
#> 2 Sebastian Meyer bastistician 2017-07-25 11:22:43
#> 3 Naras b_naras 2013-03-25 19:48:12
#> 4 useR! 2022 _useRconf 2020-07-08 10:22:55
#> 5 useR2021zrh useR2021zrh 2020-02-17 15:54:39
#> 6 useR2020muc useR2020muc 2019-11-22 14:50:55
#> 7 useR! 2020 useR2020stl 2019-03-05 03:52:58
#> 8 Roger Bivand @rsbivand@fosstodon.org RogerBivand 2013-07-01 18:19:42
#> 9 Henrik Bengtsson henrikbengtsson 2009-06-13 02:11:14
#> 10 Gabriela de Queiroz ☁️ 🥑 (fosstodon.org/@kroz) gdequeiroz 2008-09-14 18:55:29
#> # … with 20 more rows
# Look at 100 R Foundation followers
R_foundation_flw_data <- lookup_users(head(R_foundation_flw$from_id, 100), verbose = FALSE)
R_foundation_flw_data[1:5, c("name", "screen_name", "created_at")]
#> Tweets data at tweets_data()
#> # A tibble: 5 × 3
#> name screen_name created_at
#> <chr> <chr> <dttm>
#> 1 N Santos ninofss 2022-06-29 20:13:46
#> 2 KR KuriaShiroh 2017-01-13 21:02:59
#> 3 Burak Karahan _BurakKARAHAN 2013-06-25 14:20:58
#> 4 Fabrice Humura Fhumura 2013-06-20 09:23:07
#> 5 VictorAnsem VictorAnsem 2013-11-18 11:03:54
We now have information both about the accounts the R Foundation follows and about its followers. We can also retrieve the latest tweets of these users:
tweets_data(R_foundation_fds_data)[, c("created_at", "text")]
#> # A tibble: 30 × 2
#> created_at text
#> <chr> <chr>
#> 1 Thu Dec 01 11:46:14 +0000 2022 "Today! EMEA/APAC office hour has just taken place. AMER office…
#> 2 Wed Sep 28 15:06:22 +0000 2022 "RT @pdalgd: #rstats 4.2.2 \"Innocent and Trusting\" scheduled…
#> 3 Tue Nov 29 15:55:04 +0000 2022 "@stevenstrogatz Terry Tao also posted a nice note recently htt…
#> 4 Sat Nov 12 16:06:40 +0000 2022 "RT @HeathrTurnr: Wondering about setting up a data science/ope…
#> 5 Fri Jul 29 05:30:06 +0000 2022 "RT @kidssindwichtig: Wenn du in deiner Arbeitszeit noch nie ei…
#> 6 Fri Apr 16 11:03:21 +0000 2021 "RT @_useRconf: It is a good time to remember some of our keyda…
#> 7 Mon Jan 18 17:36:22 +0000 2021 "Give us a follow at @_useRconf to stay updated on *all* future…
#> 8 Sat Dec 17 13:04:43 +0000 2022 "RT @VincentAB: Want to use @jmwooldridge's Extended Two-Way Fi…
#> 9 Mon Dec 19 15:14:15 +0000 2022 "@_ColinFay @rweekly_org That was quick. Thx"
#> 10 Mon Dec 19 18:29:22 +0000 2022 "@thomas_mock @posit_pbc OMG! Congratulations!! (Sorry to tell …
#> # … with 20 more rows
Search for 100 users with the rstats hashtag in their profile bios.
## search for users with #rstats in their profiles
useRs <- search_users("#rstats", n = 100, verbose = FALSE)
useRs[, c("name", "screen_name", "created_at")]
#> Tweets data at tweets_data()
#> # A tibble: 100 × 3
#> name screen_name created_at
#> <chr> <chr> <dttm>
#> 1 Rstats rstatstweet 2018-06-27 05:45:02
#> 2 R for Data Science rstats4ds 2018-12-18 13:55:25
#> 3 FC rSTATS FC_rstats 2018-02-08 21:03:08
#> 4 #RStats Question A Day data_question 2019-10-21 19:15:24
#> 5 R Tweets rstats_tweets 2020-09-17 18:12:09
#> 6 Ramiro Bentes NbaInRstats 2019-11-05 03:44:32
#> 7 Data Science with R Rstats4Econ 2012-04-21 04:37:12
#> 8 Baseball with R BaseballRstats 2013-11-02 16:07:05
#> 9 Will steelRstats 2019-07-23 16:48:00
#> 10 LIRR Statistics (Unofficial) LIRRstats 2017-01-25 00:31:55
#> # … with 90 more rows
If we want to know what they have tweeted about, we can use tweets_data():
useRs_twt <- tweets_data(useRs)
useRs_twt[1:5, c("id_str", "created_at", "text")]
#> # A tibble: 5 × 3
#> id_str created_at text
#> <chr> <chr> <chr>
#> 1 1605298908707508224 Tue Dec 20 20:27:39 +0000 2022 "RT @meghansharris: If you're looking for me…
#> 2 1605295768717234176 Tue Dec 20 20:15:10 +0000 2022 "RT @Highcharts: A step-by-step tutorial to …
#> 3 1593655548091797505 Fri Nov 18 17:21:06 +0000 2022 "copying @lemonwatcher's tactic of twitter b…
#> 4 1605178921355706370 Tue Dec 20 12:30:52 +0000 2022 "Options:\n(Answer at: https://t.co/luCHh1j6…
#> 5 1605298970686902294 Tue Dec 20 20:27:54 +0000 2022 "RT @meghansharris: If you're looking for me…
Get the most recent tweets from the R Foundation.
## get the most recent tweets posted by the R Foundation
R_foundation_tline <- get_timeline("_R_Foundation")

## plot the frequency of tweets for each user over time
plot <- R_foundation_tline |>
  filter(created_at > "2017-10-29") |>
ts_plot(by = "month", trim = 1L) +
geom_point() +
theme_minimal() +
theme(
legend.title = element_blank(),
legend.position = "bottom",
plot.title = element_text(face = "bold")) +
labs(
x = NULL, y = NULL,
title = "Frequency of Twitter statuses posted by the R Foundation",
subtitle = "Twitter status (tweet) counts aggregated by month from October/November 2017",
caption = "Source: Data collected from Twitter's REST API via rtweet"
)
Get the 10 most recently favorited statuses by the R Foundation.
R_foundation_favs <- get_favorites("_R_Foundation", n = 10)
R_foundation_favs[, c("text", "created_at", "id_str")]
#> Users data at users_data()
#> # A tibble: 10 × 3
#> text created_at id_str
#> <chr> <dttm> <chr>
#> 1 "We're into August, which hopefully means you've had time to enjoy … 2020-08-03 09:51:33 12901…
#> 2 "Gret meeting of #useR2020 passing the torch to #useR2021! 🔥 \nTha… 2020-07-16 17:14:25 12837…
#> 3 "Also thanks to the @_R_Foundation, @R_Forwards, @RLadiesGlobal, Mi… 2020-05-28 08:57:24 12658…
#> 4 "Such an honour to be acknowledged this way at #useR2019. I'm happy… 2019-07-12 18:36:27 11497…
#> 5 "R-3.4.4 Windows installer is on CRAN now: https://t.co/h35EcsIEuF … 2018-03-15 18:16:13 97433…
#> 6 "Gala dinner with a table with people in cosmology, finance, psycho… 2017-07-07 09:10:41 88322…
#> 7 "AMAZING #RLadies at #useR2017 💜🌍 inspiring #rstats work around t… 2017-07-05 13:25:27 88256…
#> 8 "Fame at last: https://t.co/x4wIePKR6b -- it's always nice to get a… 2017-06-07 23:25:37 87256…
#> 9 "We are excited to let you know that the full Conference Program is… 2017-05-31 14:37:23 86989…
#> 10 ". @statsYSS and @RSSGlasgow1 to hold joint event celebrating 20 y… 2017-04-10 10:50:11 85135…
Discover what’s currently trending worldwide.
world <- get_trends("world")
world
#> # A tibble: 50 × 9
#> trend url promo…¹ query tweet…² place woeid as_of created_at
#> <chr> <chr> <lgl> <chr> <int> <chr> <int> <dttm> <dttm>
#> 1 #MerciLesBleus http… NA %23M… NA Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 2 #CashAppGiftCa… http… NA %23C… 11513 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 3 #PromosyonFiya… http… NA %23P… 16068 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 4 Berni http… NA Berni 12538 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 5 #المتجر_الالكت… http… NA %23%… 12029 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 6 #سعودي_ايدول http… NA %23%… NA Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 7 Rosada http… NA Rosa… 115732 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 8 Larreta http… NA Larr… 16698 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 9 CABA http… NA CABA 46767 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> 10 Ochoa http… NA Ochoa 19026 Worl… 1 2022-12-20 21:06:02 2022-12-18 11:01:37
#> # … with 40 more rows, and abbreviated variable names ¹promoted_content, ²tweet_volume
You can follow users and unfollow them:
post_follow("_R_Foundation")
#> Response [https://api.twitter.com/1.1/friendships/create.json?notify=FALSE&screen_name=_R_Foundation]
#> Date: 2022-12-20 21:06
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 3.63 kB
post_unfollow_user("rtweet_test")
#> Response [https://api.twitter.com/1.1/friendships/destroy.json?notify=FALSE&screen_name=rtweet_test]
#> Date: 2022-12-20 21:06
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 3.3 kB
There are some functions to help analyze the data extracted from the API. With clean_tweets() you can remove user mentions, hashtags, URLs and media to keep only the text, for example if you want to do sentiment analysis (you might want to remove emojis too).
clean_tweets(head(R_foundation_favs), clean = c("users", "hashtags", "urls", "media"))
#> [1] "We're into August, which hopefully means you've had time to enjoy content from !\n\nPlease help us find out who participated in the conference and what you thought of it by answering our survey: ."
#> [2] "Gret meeting of passing the torch to ! 🔥 \nThank you so much, everyone!🙏🏽\nParticularly,\n🌟 from \n🌟, chair\n🌟 & , chairs\n🌟 & , @useR2021global chairs"
#> [3] "Also thanks to the , , , MiR and many others in supporting us in this endeavour!"
#> [4] "Such an honour to be acknowledged this way at . I'm happy that folks like , , , and so many others have got on board with my ideas for the community and helped them come to fruition - even better than I could imagine. 💜 "
#> [5] "R-3.4.4 Windows installer is on CRAN now: "
#> [6] "Gala dinner with a table with people in cosmology, finance, psychology, demography, medical doctor 😊"
With entity() you can access any entity of the tweets. It returns the id of each tweet and the corresponding data field for the selected entity.
head(entity(R_foundation_favs, "urls"))
#> id_str url
#> 1 1290193576169803776 https://t.co/HYLl6rMySc
#> 2 1283782043021774850 <NA>
#> 3 1265899960228360195 <NA>
#> 4 1149719180314316800 https://t.co/dg2Dh49tug
#> 5 974333459085672448 https://t.co/h35EcsIEuF
#> 6 974333459085672448 https://t.co/7xko0aUS2w
#> expanded_url display_url
#> 1 http://bit.ly/useR2020survey bit.ly/useR2020survey
#> 2 <NA> <NA>
#> 3 <NA> <NA>
#> 4 https://twitter.com/alice_data/status/1149680375817494529 twitter.com/alice_data/sta…
#> 5 https://cran.r-project.org/bin/windows/base/ cran.r-project.org/bin/windows/ba…
#> 6 https://twitter.com/pdalgd/status/974214402097508353 twitter.com/pdalgd/status/…
#> unwound
#> 1 NA
#> 2 NA
#> 3 NA
#> 4 NA
#> 5 NA
#> 6 NA
head(entity(R_foundation_favs, "hashtags"))
#> id_str text
#> 1 1290193576169803776 useR2020
#> 2 1283782043021774850 useR2020
#> 3 1283782043021774850 useR2021
#> 4 1265899960228360195 <NA>
#> 5 1149719180314316800 useR2019
#> 6 1149719180314316800 rstats
head(entity(R_foundation_favs, "symbols"))
#> id_str text
#> 1 1290193576169803776 NA
#> 2 1283782043021774850 NA
#> 3 1265899960228360195 NA
#> 4 1149719180314316800 NA
#> 5 974333459085672448 NA
#> 6 883221715777720320 NA
head(entity(R_foundation_favs, "user_mentions"))
#> id_str screen_name name user_id
#> 1 1290193576169803776 <NA> <NA> NA
#> 2 1283782043021774850 HeathrTurnr Heather Turner is on strike 3.367337e+09
#> 3 1283782043021774850 _R_Foundation The R Foundation @R_Foundation@fosstodon.org 7.944582e+17
#> 4 1283782043021774850 HeidiBaya Heidi Seibold @HeidiSeibold@fosstodon.org 5.321151e+08
#> 5 1283782043021774850 useR2020muc useR2020muc 1.197875e+18
#> 6 1283782043021774850 chrisprener Chris Prener 5.075829e+08
#> user_id_str
#> 1 <NA>
#> 2 3367336625
#> 3 794458165987438592
#> 4 532115122
#> 5 1197874989367779328
#> 6 507582860
head(entity(R_foundation_favs, "media"))
#> id_str id id_str media_url media_url_https url display_url expanded_url type
#> 1 1290193576169803776 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 2 1283782043021774850 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 3 1265899960228360195 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 4 1149719180314316800 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 5 974333459085672448 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 6 883221715777720320 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> sizes ext_alt_text
#> 1 NA NA
#> 2 NA NA
#> 3 NA NA
#> 4 NA NA
#> 5 NA NA
#> 6 NA NA
To avoid having two columns with the same name, the user_mentions entity renames the ids from “id” and “id_str” to “user_id” and “user_id_str”.
You can mute and unmute users:
post_follow("rtweet_test", mute = TRUE)
post_follow("rtweet_test", mute = FALSE)
You can block users and unblock them:
user_block("RTweetTest1")
#> Response [https://api.twitter.com/1.1/blocks/create.json?screen_name=RTweetTest1]
#> Date: 2022-12-20 21:06
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 1.35 kB
user_unblock("RTweetTest1")
#> Response [https://api.twitter.com/1.1/blocks/destroy.json?screen_name=RTweetTest1]
#> Date: 2022-12-20 21:06
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 1.35 kB
Twitter sets a limited number of calls to its endpoints for the different authentications (check vignette("auth", "rtweet") to find which one is better for your use case). To consult those limits you can use rate_limit():
rate_limit()
#> # A tibble: 263 × 5
#> resource limit remaining reset_at reset
#> <chr> <int> <int> <dttm> <drtn>
#> 1 /lists/list 15 15 2022-12-20 22:21:03 15 mins
#> 2 /lists/:id/tweets&GET 900 900 2022-12-20 22:21:03 15 mins
#> 3 /lists/:id/followers&GET 180 180 2022-12-20 22:21:03 15 mins
#> 4 /lists/memberships 75 75 2022-12-20 22:21:03 15 mins
#> 5 /lists/:id&DELETE 300 300 2022-12-20 22:21:03 15 mins
#> 6 /lists/subscriptions 15 15 2022-12-20 22:21:03 15 mins
#> 7 /lists/members 900 900 2022-12-20 22:21:03 15 mins
#> 8 /lists/:id&GET 75 75 2022-12-20 22:21:03 15 mins
#> 9 /lists/subscribers/show 15 15 2022-12-20 22:21:03 15 mins
#> 10 /lists/:id&PUT 300 300 2022-12-20 22:21:03 15 mins
#> # … with 253 more rows
# Search only those related to followers
rate_limit("followers")
#> # A tibble: 5 × 5
#> resource limit remaining reset_at reset
#> <chr> <int> <int> <dttm> <drtn>
#> 1 /lists/:id/followers&GET 180 180 2022-12-20 22:21:04 15 mins
#> 2 /users/:id/followers 15 15 2022-12-20 22:21:04 15 mins
#> 3 /users/by/username/:username/followers 15 15 2022-12-20 22:21:04 15 mins
#> 4 /followers/ids 15 9 2022-12-20 22:20:50 15 mins
#> 5 /followers/list 15 15 2022-12-20 22:21:04 15 mins
The remaining column shows the number of times you can still call an endpoint (not the number of followers you can retrieve). After each query the number decreases until the limit is reset.
If your queries return an error, check whether you have already exhausted your quota and try again after the time shown in “reset_at”.
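For example, a minimal sketch of checking the remaining quota before a call (the resource name is taken from the rate_limit() output above; more_flw is a throwaway name):

```r
## check how many calls remain for the followers/ids endpoint
fl <- rate_limit("followers/ids")
if (any(fl$remaining > 0)) {
  more_flw <- get_followers("_R_Foundation")
} else {
  message("Quota exhausted; retry after ", fl$reset_at)
}
```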
For streaming tweets, please see vignette("stream", "rtweet") for more information.
To provide real examples, this vignette is precomputed before submission. Also note that the results returned by the API will change over time.
sessionInfo()
#> R version 4.2.2 (2022-10-31)
#> Platform: x86_64-pc-linux-gnu (64-bit)
#> Running under: Ubuntu 22.04.1 LTS
#>
#> Matrix products: default
#> BLAS: /usr/lib/x86_64-linux-gnu/openblas-pthread/libblas.so.3
#> LAPACK: /usr/lib/x86_64-linux-gnu/openblas-pthread/libopenblasp-r0.3.20.so
#>
#> locale:
#> [1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C LC_TIME=es_ES.UTF-8
#> [4] LC_COLLATE=en_US.UTF-8 LC_MONETARY=es_ES.UTF-8 LC_MESSAGES=en_US.UTF-8
#> [7] LC_PAPER=es_ES.UTF-8 LC_NAME=C LC_ADDRESS=C
#> [10] LC_TELEPHONE=C LC_MEASUREMENT=es_ES.UTF-8 LC_IDENTIFICATION=C
#>
#> attached base packages:
#> [1] stats graphics grDevices utils datasets methods base
#>
#> other attached packages:
#> [1] dplyr_1.0.10 ggplot2_3.4.0 knitr_1.41 rtweet_1.0.2.9018
#> [5] BiocManager_1.30.19 cyclocomp_1.1.0 testthat_3.1.5 devtools_2.4.5
#> [9] usethis_2.1.6
#>
#> loaded via a namespace (and not attached):
#> [1] fs_1.5.2 bit64_4.0.5 progress_1.2.2 httr_1.4.4 rprojroot_2.0.3
#> [6] tools_4.2.2 profvis_0.3.7 utf8_1.2.2 R6_2.5.1 DBI_1.1.3
#> [11] colorspace_2.0-3 urlchecker_1.0.1 withr_2.5.0 tidyselect_1.2.0 prettyunits_1.1.1
#> [16] processx_3.8.0 bit_4.0.5 curl_4.3.3 compiler_4.2.2 httr2_0.2.2
#> [21] cli_3.4.1 webmockr_0.8.2 xml2_1.3.3 desc_1.4.2 labeling_0.4.2
#> [26] triebeard_0.3.0 scales_1.2.1 callr_3.7.3 askpass_1.1 rappdirs_0.3.3
#> [31] commonmark_1.8.1 stringr_1.5.0 digest_0.6.30 rmarkdown_2.18 base64enc_0.1-3
#> [36] pkgconfig_2.0.3 htmltools_0.5.3 sessioninfo_1.2.2 highr_0.9 fastmap_1.1.0
#> [41] htmlwidgets_1.5.4 rlang_1.0.6 rstudioapi_0.14 httpcode_0.3.0 shiny_1.7.3
#> [46] farver_2.1.1 generics_0.1.3 jsonlite_1.8.4 magrittr_2.0.3 fauxpas_0.5.0
#> [51] Rcpp_1.0.9 munsell_0.5.0 fansi_1.0.3 lifecycle_1.0.3 stringi_1.7.8
#> [56] whisker_0.4.1 yaml_2.3.6 brio_1.1.3 pkgbuild_1.4.0 grid_4.2.2
#> [61] promises_1.2.0.1 crayon_1.5.2 miniUI_0.1.1.1 hms_1.1.2 ps_1.7.2
#> [66] pillar_1.8.1 pkgload_1.3.2 crul_1.3 glue_1.6.2 evaluate_0.18
#> [71] remotes_2.4.2 vctrs_0.5.1 httpuv_1.6.6 urltools_1.7.3 gtable_0.3.1
#> [76] openssl_2.0.5 purrr_0.3.5 assertthat_0.2.1 cachem_1.0.6 xfun_0.35
#> [81] mime_0.12 xtable_1.8-4 roxygen2_7.2.2 later_1.3.0 vcr_1.2.0
#> [86] tibble_3.1.8 memoise_2.0.1 ellipsis_0.3.2