This vignette provides a quick tour of the rtweet package.
Authenticate
First, set up your own credentials; this only needs to be done once:
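## interactive setup, assuming the standard rtweet helper
auth_setup_default()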
This will open your browser to look up your account, create a token, and save it as the default. From now on, in this R session and in others, you can use this authentication with:
auth_as("default")
rtweet will then automatically use that token in all the API queries it makes during the session.
If you want to set up a bot or collect a lot of information, please read vignette("auth", "rtweet").
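For reference, a minimal sketch of saving a bot token under a name of your choice; the keys are placeholders from the Twitter developer portal and the token name "my-bot" is arbitrary (see the auth vignette for the full workflow):
## sketch: create and save a bot token (keys are placeholders)
token <- rtweet_bot(
  api_key = "...", api_secret = "...",
  access_token = "...", access_secret = "..."
)
auth_save(token, "my-bot")
auth_as("my-bot")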
Search tweets
You can search tweets:
## search for 100 tweets using the #rstats hashtag
rstats <- search_tweets("#rstats", n = 100, include_rts = FALSE)
colnames(rstats)
#> [1] "created_at" "id"
#> [3] "id_str" "text"
#> [5] "full_text" "truncated"
#> [7] "entities" "source"
#> [9] "in_reply_to_status_id" "in_reply_to_status_id_str"
#> [11] "in_reply_to_user_id" "in_reply_to_user_id_str"
#> [13] "in_reply_to_screen_name" "geo"
#> [15] "coordinates" "place"
#> [17] "contributors" "is_quote_status"
#> [19] "retweet_count" "favorite_count"
#> [21] "favorited" "favorited_by"
#> [23] "retweeted" "scopes"
#> [25] "lang" "possibly_sensitive"
#> [27] "display_text_width" "display_text_range"
#> [29] "retweeted_status" "quoted_status"
#> [31] "quoted_status_id" "quoted_status_id_str"
#> [33] "quoted_status_permalink" "quote_count"
#> [35] "timestamp_ms" "reply_count"
#> [37] "filter_level" "metadata"
#> [39] "query" "withheld_scope"
#> [41] "withheld_copyright" "withheld_in_countries"
#> [43] "possibly_sensitive_appealable"
rstats[1:5, c("created_at", "text", "id_str")]
#> # A tibble: 5 × 3
#> created_at text id_str
#> <dttm> <chr> <chr>
#> 1 2023-01-10 10:00:22 "Mastering Deep Learning with Python: 2 Manuscripts in 1. #BigData #… 16127…
#> 2 2023-01-10 08:26:02 "Difference Between Deep Learning and Machine Learning. #BigData #An… 16127…
#> 3 2023-01-10 20:21:19 "The incredible graphic folks over at @FiveThirtyEight (e.g., @ryana… 16128…
#> 4 2023-01-11 20:28:18 "Science job l#DataScience #Cybersecurity #Analytics #AI #IIoT #Pyth… 16132…
#> 5 2023-01-11 20:24:59 "Day 70 of #100DaysOfCode \nFocused on how to interpret data plots w… 16132…
#> ℹ Users data at users_data()
The include_rts = FALSE argument excludes retweets from the search.
Twitter rate limits the number of calls you can make to each endpoint. See rate_limit() and the rate limits section below. If your query requires more calls, as in the example below, simply set retryonratelimit = TRUE and rtweet will wait for rate limit resets for you.
## search for 250,000 tweets containing the word peace
tweets_peace <- search_tweets("peace", n = 250000, retryonratelimit = TRUE)
You can also search by geo-location, for example English-language tweets sent from the United States.
# search for tweets sent from the US
# lookup_coords() requires a Google Maps API key for locations other than "usa", "canada" and "world"
geo_tweets <- search_tweets("lang:en", geocode = lookup_coords("usa"), n = 100)
geo_tweets[1:5, c("created_at", "text", "id_str", "lang", "place")]
#> # A tibble: 5 × 5
#> created_at text id_str lang place
#> <dttm> <chr> <chr> <chr> <lis>
#> 1 2023-01-11 20:31:41 Me these past few weeks..🥲 https://t.co/dSXK3TzFu1 16132… en <df>
#> 2 2023-01-11 20:31:41 @tommybrooker This is our season! #BugNation https://t.c… 16132… en <df>
#> 3 2023-01-11 20:31:41 This team is fun https://t.co/tFbQZ5763P 16132… en <df>
#> 4 2023-01-11 20:31:41 Uh [email protected] already up to mischief... 16132… en <df>
#> 5 2023-01-11 20:31:41 coachella gon be crazyyyyy 🤝 16132… en <df>
#> ℹ Users data at users_data()
You can check the location of these tweets with lat_lng(), as sketched below, or quickly visualize the frequency of tweets over time using ts_plot() (if ggplot2 is installed).
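A minimal sketch with lat_lng(), which appends latitude and longitude columns parsed from the tweets' geo data (column names as documented for rtweet):
## add lat/lng columns to the geo-located tweets
geo_located <- lat_lng(geo_tweets)
head(geo_located[, c("created_at", "lat", "lng")])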
## plot time series of tweets (needs ggplot2)
library(ggplot2)
ts_plot(rstats) +
theme_minimal() +
theme(plot.title = element_text(face = "bold")) +
labs(
x = NULL, y = NULL,
title = "Frequency of #rstats Twitter statuses from past 9 days",
subtitle = "Twitter status (tweet) counts aggregated using three-hour intervals",
caption = "Source: Data collected from Twitter's REST API via rtweet"
)

[Plot: frequency of #rstats Twitter statuses over the past 9 days, aggregated in three-hour intervals]
Posting statuses
You can post tweets with:
post_tweet(paste0("My first tweet with #rtweet #rstats at ", Sys.time()))
#> Your tweet has been posted!
It can include media and alt text:
path_file <- tempfile(fileext = ".png")
png(filename = path_file)
plot(mpg ~ cyl, mtcars, col = gear, pch = gear)
dev.off()
#> png
#> 2
post_tweet("my first tweet with #rtweet with media #rstats", media = path_file, media_alt_text = "Plot of mtcars dataset, showing cyl vs mpg colored by gear. The lower cyl the higher the mpg is.")
#> Your tweet has been posted!
You can also reply to a previous tweet or retweet it by providing additional arguments to post_tweet(), as sketched below.
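A minimal sketch, where my_status_id stands for the id of an existing tweet (it is only a placeholder):
## reply to an existing tweet
post_tweet("Replying with #rtweet #rstats", in_reply_to_status_id = my_status_id)
## retweet it
post_tweet(retweet_id = my_status_id)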
Get friends
Retrieve a list of all the accounts a user follows.
## get user IDs of accounts followed by R Foundation
R_foundation_fds <- get_friends("_R_Foundation")
R_foundation_fds
#> # A tibble: 30 × 2
#> from_id to_id
#> <chr> <chr>
#> 1 _R_Foundation 1448728978370535426
#> 2 _R_Foundation 889777924991307778
#> 3 _R_Foundation 1300656590
#> 4 _R_Foundation 1280779280579022848
#> 5 _R_Foundation 1229418786085888001
#> 6 _R_Foundation 1197874989367779328
#> 7 _R_Foundation 1102763906714554368
#> 8 _R_Foundation 1560929287
#> 9 _R_Foundation 46782674
#> 10 _R_Foundation 16284661
#> # … with 20 more rows
Using get_friends() we can retrieve the accounts followed by the R Foundation. It also accepts several users at once, as sketched below.
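A minimal sketch querying two accounts in a single call (the second account appears in the lookup_users() output further down); the result has one row per friendship:
## friends of two accounts at once
both <- get_friends(c("_R_Foundation", "R_Contributors"))
table(both$from_id)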
Get followers
If you want all the accounts that follow a user, you can use get_followers():
R_foundation_flw <- get_followers("_R_Foundation", n = 30000,
retryonratelimit = TRUE)
#> Downloading multiple pages ======================>----------------------------------------------
#> Downloading multiple pages =================================>-----------------------------------
#> Downloading multiple pages =============================================>-----------------------
#> Downloading multiple pages =========================================================>-----------
Note that the retryonratelimit option is intended for when you need more queries than Twitter provides in a given period. You can check with rate_limit() how many calls are available for the endpoints you are using. If the limit is exceeded, retryonratelimit waits until more calls are available and then resumes the query.
Lookup users
We can use lookup_users() to retrieve information about these accounts:
# Look up the accounts followed by the R Foundation
R_foundation_fds_data <- lookup_users(R_foundation_fds$to_id, verbose = FALSE)
R_foundation_fds_data[, c("name", "screen_name", "created_at")]
#> # A tibble: 30 × 3
#> name screen_name created_at
#> <chr> <chr> <dttm>
#> 1 R Contributors R_Contributors 2021-10-14 21:15:12
#> 2 Sebastian Meyer bastistician 2017-07-25 11:22:43
#> 3 Naras b_naras 2013-03-25 19:48:12
#> 4 useR! 2022 _useRconf 2020-07-08 10:22:55
#> 5 useR2021zrh useR2021zrh 2020-02-17 15:54:39
#> 6 useR2020muc useR2020muc 2019-11-22 14:50:55
#> 7 useR! 2020 useR2020stl 2019-03-05 03:52:58
#> 8 Roger Bivand @[email protected] RogerBivand 2013-07-01 18:19:42
#> 9 Henrik Bengtsson henrikbengtsson 2009-06-13 02:11:14
#> 10 Gabriela de Queiroz ☁️ 🥑 (fosstodon.org/@kroz) gdequeiroz 2008-09-14 18:55:29
#> # … with 20 more rows
#> ℹ Tweets data at tweets_data()
# Look up 100 of the R Foundation's followers
R_foundation_flw_data <- lookup_users(head(R_foundation_flw$from_id, 100), verbose = FALSE)
R_foundation_flw_data[1:5, c("name", "screen_name", "created_at")]
#> # A tibble: 5 × 3
#> name screen_name created_at
#> <chr> <chr> <dttm>
#> 1 Junho CHOI econjunho 2021-05-06 07:31:25
#> 2 Ricardo Junqueira Ricardo25486602 2021-01-30 17:41:32
#> 3 lucapiov lukeorainy 2021-04-01 15:17:37
#> 4 Vipra Sharma ProtVipra 2022-09-24 09:14:18
#> 5 TRINQUE À LA NUIT COMME À LA MORT DarkblooM_SR 2020-07-14 18:42:30
#> ℹ Tweets data at tweets_data()
We now have information about the accounts followed by the R Foundation and about its followers. We can also retrieve the latest tweets of these users:
tweets_data(R_foundation_fds_data)[, c("created_at", "text")]
#> # A tibble: 30 × 2
#> created_at text
#> <chr> <chr>
#> 1 Mon Jan 09 11:13:53 +0000 2023 "Two (independent) sessions on Thursday Jan 9th:\n\n- 10am-11am…
#> 2 Wed Sep 28 15:06:22 +0000 2022 "RT @pdalgd: #rstats 4.2.2 \"Innocent and Trusting\" scheduled…
#> 3 Tue Nov 29 15:55:04 +0000 2022 "@stevenstrogatz Terry Tao also posted a nice note recently htt…
#> 4 Sat Nov 12 16:06:40 +0000 2022 "RT @HeathrTurnr: Wondering about setting up a data science/ope…
#> 5 Fri Jul 29 05:30:06 +0000 2022 "RT @kidssindwichtig: Wenn du in deiner Arbeitszeit noch nie ei…
#> 6 Fri Apr 16 11:03:21 +0000 2021 "RT @_useRconf: It is a good time to remember some of our keyda…
#> 7 Mon Jan 18 17:36:22 +0000 2021 "Give us a follow at @_useRconf to stay updated on *all* future…
#> 8 Thu Jan 05 12:26:46 +0000 2023 "RT @GdalOrg: GDAL 3.6.2 is released: https://t.co/MxQYuXgrdU"
#> 9 Wed Jan 11 05:27:35 +0000 2023 "{progressr} 0.13.0 on CRAN. Setting\n\noptions(cli.progress_ha…
#> 10 Tue Jan 10 20:51:48 +0000 2023 "RT @WirelessLife: Developers, learn how to avoid the limits on…
#> # … with 20 more rows
Search users
Search for 100 users with the #rstats hashtag in their profile bios.
## search for users with #rstats in their profiles
useRs <- search_users("#rstats", n = 100, verbose = FALSE)
useRs[, c("name", "screen_name", "created_at")]
#> # A tibble: 100 × 3
#> name screen_name created_at
#> <chr> <chr> <dttm>
#> 1 Rstats rstatstweet 2018-06-27 05:45:02
#> 2 R for Data Science rstats4ds 2018-12-18 13:55:25
#> 3 FC rSTATS FC_rstats 2018-02-08 21:03:08
#> 4 #RStats Question A Day data_question 2019-10-21 19:15:24
#> 5 R Tweets rstats_tweets 2020-09-17 18:12:09
#> 6 Ramiro Bentes NbaInRstats 2019-11-05 03:44:32
#> 7 Data Science with R Rstats4Econ 2012-04-21 04:37:12
#> 8 Baseball with R BaseballRstats 2013-11-02 16:07:05
#> 9 Will steelRstats 2019-07-23 16:48:00
#> 10 LIRR Statistics (Unofficial) LIRRstats 2017-01-25 00:31:55
#> # … with 90 more rows
#> ℹ Tweets data at tweets_data()
If we want to know what they have tweeted about, we can use tweets_data():
useRs_twt <- tweets_data(useRs)
useRs_twt[1:5, c("id_str", "created_at", "text")]
#> # A tibble: 5 × 3
#> id_str created_at text
#> <chr> <chr> <chr>
#> 1 1613255835848945665 Wed Jan 11 19:25:38 +0000 2023 "RT @JosechOpula: Day 70 of #100DaysOfCode \…
#> 2 1613256982135185418 Wed Jan 11 19:30:12 +0000 2023 "RT @amitisinvesting: Palantir & Posit, …
#> 3 1612766122084036610 Tue Jan 10 10:59:41 +0000 2023 "yep @petermckeever doing his thing https://…
#> 4 1613151454860283905 Wed Jan 11 12:30:52 +0000 2023 "4. 1 would give \"n\" \"n\" \"p\" and 2 wou…
#> 5 1613254456774107137 Wed Jan 11 19:20:09 +0000 2023 "RT @EconomicsShiny: Uncertainty Update: VIX…
Get timelines
Get the most recent tweets from the R Foundation.
## get the most recent tweets posted by the R Foundation
R_foundation_tline <- get_timeline("_R_Foundation")
## plot the frequency of tweets over time
library(dplyr)  # for filter()
plot <- R_foundation_tline |>
filter(created_at > "2017-10-29") |>
ts_plot(by = "month", trim = 1L) +
geom_point() +
theme_minimal() +
theme(
legend.title = element_blank(),
legend.position = "bottom",
plot.title = element_text(face = "bold")) +
labs(
x = NULL, y = NULL,
title = "Frequency of Twitter statuses posted by the R Foundation",
subtitle = "Twitter status (tweet) counts aggregated by month from October/November 2017",
caption = "Source: Data collected from Twitter's REST API via rtweet"
)
Get favorites
Get the 10 most recently favorited statuses by the R Foundation.
R_foundation_favs <- get_favorites("_R_Foundation", n = 10)
R_foundation_favs[, c("text", "created_at", "id_str")]
#> # A tibble: 10 × 3
#> text created_at id_str
#> <chr> <dttm> <chr>
#> 1 "We're into August, which hopefully means you've had time to enjoy … 2020-08-03 09:51:33 12901…
#> 2 "Gret meeting of #useR2020 passing the torch to #useR2021! 🔥 \nTha… 2020-07-16 17:14:25 12837…
#> 3 "Also thanks to the @_R_Foundation, @R_Forwards, @RLadiesGlobal, Mi… 2020-05-28 08:57:24 12658…
#> 4 "Such an honour to be acknowledged this way at #useR2019. I'm happy… 2019-07-12 18:36:27 11497…
#> 5 "R-3.4.4 Windows installer is on CRAN now: https://t.co/h35EcsIEuF … 2018-03-15 18:16:13 97433…
#> 6 "Gala dinner with a table with people in cosmology, finance, psycho… 2017-07-07 09:10:41 88322…
#> 7 "AMAZING #RLadies at #useR2017 💜🌍 inspiring #rstats work around t… 2017-07-05 13:25:27 88256…
#> 8 "Fame at last: https://t.co/x4wIePKR6b -- it's always nice to get a… 2017-06-07 23:25:37 87256…
#> 9 "We are excited to let you know that the full Conference Program is… 2017-05-31 14:37:23 86989…
#> 10 ". @statsYSS and @RSSGlasgow1 to hold joint event celebrating 20 y… 2017-04-10 10:50:11 85135…
#> ℹ Users data at users_data()
Get trends
Discover what’s currently trending worldwide.
world <- get_trends("world")
world
#> # A tibble: 50 × 9
#> trend url promo…¹ query tweet…² place woeid as_of created_at
#> <chr> <chr> <lgl> <chr> <int> <chr> <int> <dttm> <dttm>
#> 1 #BecasEdomex http… NA %23B… 49048 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 2 #iHeartAwards http… NA %23i… 746169 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 3 #BestMusicVideo http… NA %23B… 374249 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 4 #BestFanArmy http… NA %23B… 215723 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 5 LOVE YOU ALBA http… NA %22L… 67091 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 6 #Louies http… NA %23L… 46855 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 7 Sara Manuela http… NA %22S… NA Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 8 FELiPLAY LEVEL… http… NA %22F… 17419 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 9 Naomi Osaka http… NA %22N… NA Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> 10 Shakira http… NA Shak… 212478 Worl… 1 2023-01-11 19:31:58 2023-01-10 13:39:07
#> # … with 40 more rows, and abbreviated variable names ¹promoted_content, ²tweet_volume
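get_trends() accepts other places too; assuming your rtweet version includes the trends_available() helper, you can list the supported locations and their WOEIDs:
## list the locations accepted by get_trends()
trends_available()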
Following users
You can follow users and unfollow them:
post_follow("_R_Foundation")
#> Response [https://api.twitter.com/1.1/friendships/create.json?notify=FALSE&screen_name=_R_Foundation]
#> Date: 2023-01-11 19:31
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 2.76 kB
post_unfollow_user("rtweet_test")
#> Response [https://api.twitter.com/1.1/friendships/destroy.json?notify=FALSE&screen_name=rtweet_test]
#> Date: 2023-01-11 19:31
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 3.3 kB
Managing data
There are some functions to help analyze the data extracted from the API.
With clean_tweets() you can remove user mentions, hashtags, URLs and media to keep only the text, for example if you want to do sentiment analysis (you might want to remove emojis too; see the sketch after the example below).
clean_tweets(head(R_foundation_favs), clean = c("users", "hashtags", "urls", "media"))
#> [1] "We're into August, which hopefully means you've had time to enjoy content from !\n\nPlease help us find out who participated in the conference and what you thought of it by answering our survey: ."
#> [2] "Gret meeting of passing the torch to ! 🔥 \nThank you so much, everyone!🙏🏽\nParticularly,\n🌟 from \n🌟, chair\n🌟 & , chairs\n🌟 & , @useR2021global chairs"
#> [3] "Also thanks to the , , , MiR and many others in supporting us in this endeavour!"
#> [4] "Such an honour to be acknowledged this way at . I'm happy that folks like , , , and so many others have got on board with my ideas for the community and helped them come to fruition - even better than I could imagine. 💜 "
#> [5] "R-3.4.4 Windows installer is on CRAN now: "
#> [6] "Gala dinner with a table with people in cosmology, finance, psychology, demography, medical doctor 😊"
With entity() you can access any entity of the tweets. It returns the id of the tweet and the corresponding data fields of the selected entity.
head(entity(R_foundation_favs, "urls"))
#> id_str url
#> 1 1290193576169803776 https://t.co/HYLl6rMySc
#> 2 1283782043021774850 <NA>
#> 3 1265899960228360195 <NA>
#> 4 1149719180314316800 https://t.co/dg2Dh49tug
#> 5 974333459085672448 https://t.co/h35EcsIEuF
#> 6 974333459085672448 https://t.co/7xko0aUS2w
#> expanded_url display_url
#> 1 http://bit.ly/useR2020survey bit.ly/useR2020survey
#> 2 <NA> <NA>
#> 3 <NA> <NA>
#> 4 https://twitter.com/alice_data/status/1149680375817494529 twitter.com/alice_data/sta…
#> 5 https://cran.r-project.org/bin/windows/base/ cran.r-project.org/bin/windows/ba…
#> 6 https://twitter.com/pdalgd/status/974214402097508353 twitter.com/pdalgd/status/…
#> unwound
#> 1 NA
#> 2 NA
#> 3 NA
#> 4 NA
#> 5 NA
#> 6 NA
head(entity(R_foundation_favs, "hashtags"))
#> id_str text
#> 1 1290193576169803776 useR2020
#> 2 1283782043021774850 useR2020
#> 3 1283782043021774850 useR2021
#> 4 1265899960228360195 <NA>
#> 5 1149719180314316800 useR2019
#> 6 1149719180314316800 rstats
head(entity(R_foundation_favs, "symbols"))
#> id_str text
#> 1 1290193576169803776 NA
#> 2 1283782043021774850 NA
#> 3 1265899960228360195 NA
#> 4 1149719180314316800 NA
#> 5 974333459085672448 NA
#> 6 883221715777720320 NA
head(entity(R_foundation_favs, "user_mentions"))
#> id_str screen_name name user_id
#> 1 1290193576169803776 <NA> <NA> NA
#> 2 1283782043021774850 HeathrTurnr Heather Turner 3.367337e+09
#> 3 1283782043021774850 _R_Foundation The R Foundation 7.944582e+17
#> 4 1283782043021774850 HeidiBaya Heidi Seibold @[email protected] 5.321151e+08
#> 5 1283782043021774850 useR2020muc useR2020muc 1.197875e+18
#> 6 1283782043021774850 chrisprener Chris Prener 5.075829e+08
#> user_id_str
#> 1 <NA>
#> 2 3367336625
#> 3 794458165987438592
#> 4 532115122
#> 5 1197874989367779328
#> 6 507582860
head(entity(R_foundation_favs, "media"))
#> id_str id id_str media_url media_url_https url display_url expanded_url type
#> 1 1290193576169803776 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 2 1283782043021774850 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 3 1265899960228360195 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 4 1149719180314316800 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 5 974333459085672448 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> 6 883221715777720320 NA <NA> <NA> <NA> <NA> <NA> <NA> <NA>
#> sizes ext_alt_text
#> 1 NA NA
#> 2 NA NA
#> 3 NA NA
#> 4 NA NA
#> 5 NA NA
#> 6 NA NA
To avoid having two columns with the same name, the "user_mentions" entity renames the mentioned users' "id" and "id_str" fields to "user_id" and "user_id_str".
Muting users
You can mute and unmute users:
post_follow("rtweet_test", mute = TRUE)
post_follow("rtweet_test", mute = FALSE)
Blocking users
You can block users and unblock them:
user_block("RTweetTest1")
#> Response [https://api.twitter.com/1.1/blocks/create.json?screen_name=RTweetTest1]
#> Date: 2023-01-11 19:31
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 1.35 kB
user_unblock("RTweetTest1")
#> Response [https://api.twitter.com/1.1/blocks/destroy.json?screen_name=RTweetTest1]
#> Date: 2023-01-11 19:31
#> Status: 200
#> Content-Type: application/json;charset=utf-8
#> Size: 1.35 kB
Rate limits
Twitter sets a limited number of calls to its endpoints for the different authentications (check vignette("auth", "rtweet") to find which one is better for your use case). To consult those limits you can use rate_limit():
rate_limit()
#> # A tibble: 263 × 5
#> resource limit remaining reset_at reset
#> <chr> <int> <int> <dttm> <drtn>
#> 1 /lists/list 15 15 2023-01-11 20:46:59 15 mins
#> 2 /lists/:id/tweets&GET 900 900 2023-01-11 20:46:59 15 mins
#> 3 /lists/:id/followers&GET 180 180 2023-01-11 20:46:59 15 mins
#> 4 /lists/memberships 75 75 2023-01-11 20:46:59 15 mins
#> 5 /lists/:id&DELETE 300 300 2023-01-11 20:46:59 15 mins
#> 6 /lists/subscriptions 15 15 2023-01-11 20:46:59 15 mins
#> 7 /lists/members 900 900 2023-01-11 20:46:59 15 mins
#> 8 /lists/:id&GET 75 75 2023-01-11 20:46:59 15 mins
#> 9 /lists/subscribers/show 15 15 2023-01-11 20:46:59 15 mins
#> 10 /lists/:id&PUT 300 300 2023-01-11 20:46:59 15 mins
#> # … with 253 more rows
# Search only those related to followers
rate_limit("followers")
#> # A tibble: 5 × 5
#> resource limit remaining reset_at reset
#> <chr> <int> <int> <dttm> <drtn>
#> 1 /lists/:id/followers&GET 180 180 2023-01-11 20:47:00 15 mins
#> 2 /users/:id/followers 15 15 2023-01-11 20:47:00 15 mins
#> 3 /users/by/username/:username/followers 15 15 2023-01-11 20:47:00 15 mins
#> 4 /followers/ids 15 3 2023-01-11 20:40:32 9 mins
#> 5 /followers/list 15 15 2023-01-11 20:47:00 15 mins
The remaining column shows the number of times you can still call an endpoint (not the number of followers you can search). After a query this number decreases until the limit is reset again.
If your queries return an error, check whether you have already exhausted your quota and try again after the time shown in “reset_at”, as sketched below.
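A minimal sketch with base R around rate_limit(), reusing the "followers/ids" resource and the columns shown above:
rl <- rate_limit("followers/ids")
if (rl$remaining[1] == 0) {
  ## sleep until the endpoint resets, then retry the query
  Sys.sleep(max(0, as.numeric(difftime(rl$reset_at[1], Sys.time(), units = "secs"))))
}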
Stream tweets
Please see vignette("stream", "rtweet") for more information.
SessionInfo
To provide real examples, this vignette is precomputed before submission. Note also that results returned by the API will change over time.
sessionInfo()
#> R version 4.2.2 (2022-10-31)
#> Platform: x86_64-pc-linux-gnu (64-bit)
#> Running under: Ubuntu 22.04.1 LTS
#>
#> Matrix products: default
#> BLAS: /usr/lib/x86_64-linux-gnu/openblas-pthread/libblas.so.3
#> LAPACK: /usr/lib/x86_64-linux-gnu/openblas-pthread/libopenblasp-r0.3.20.so
#>
#> locale:
#> [1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C LC_TIME=es_ES.UTF-8
#> [4] LC_COLLATE=en_US.UTF-8 LC_MONETARY=es_ES.UTF-8 LC_MESSAGES=en_US.UTF-8
#> [7] LC_PAPER=es_ES.UTF-8 LC_NAME=C LC_ADDRESS=C
#> [10] LC_TELEPHONE=C LC_MEASUREMENT=es_ES.UTF-8 LC_IDENTIFICATION=C
#>
#> attached base packages:
#> [1] stats graphics grDevices utils datasets methods base
#>
#> other attached packages:
#> [1] dplyr_1.0.10 ggplot2_3.4.0 rtweet_1.1.0.9000 BiocManager_1.30.19
#> [5] cyclocomp_1.1.0 testthat_3.1.5 devtools_2.4.5 usethis_2.1.6
#>
#> loaded via a namespace (and not attached):
#> [1] fs_1.5.2 bit64_4.0.5 progress_1.2.2 httr_1.4.4 rprojroot_2.0.3
#> [6] tools_4.2.2 profvis_0.3.7 utf8_1.2.2 R6_2.5.1 DBI_1.1.3
#> [11] colorspace_2.0-3 urlchecker_1.0.1 withr_2.5.0 tidyselect_1.2.0 prettyunits_1.1.1
#> [16] processx_3.8.0 bit_4.0.5 curl_4.3.3 compiler_4.2.2 httr2_0.2.2
#> [21] cli_3.4.1 webmockr_0.8.2 desc_1.4.2 labeling_0.4.2 triebeard_0.3.0
#> [26] scales_1.2.1 callr_3.7.3 askpass_1.1 rappdirs_0.3.3 stringr_1.5.0
#> [31] digest_0.6.30 rmarkdown_2.19 base64enc_0.1-3 pkgconfig_2.0.3 htmltools_0.5.3
#> [36] sessioninfo_1.2.2 highr_0.9 fastmap_1.1.0 htmlwidgets_1.5.4 rlang_1.0.6
#> [41] rstudioapi_0.14 httpcode_0.3.0 shiny_1.7.3 generics_0.1.3 farver_2.1.1
#> [46] jsonlite_1.8.4 magrittr_2.0.3 fauxpas_0.5.0 Rcpp_1.0.9 munsell_0.5.0
#> [51] fansi_1.0.3 lifecycle_1.0.3 stringi_1.7.8 whisker_0.4.1 yaml_2.3.6
#> [56] brio_1.1.3 pkgbuild_1.4.0 grid_4.2.2 promises_1.2.0.1 crayon_1.5.2
#> [61] miniUI_0.1.1.1 hms_1.1.2 knitr_1.41 ps_1.7.2 pillar_1.8.1
#> [66] pkgload_1.3.2 crul_1.3 glue_1.6.2 evaluate_0.18 remotes_2.4.2
#> [71] vctrs_0.5.1 httpuv_1.6.6 urltools_1.7.3 gtable_0.3.1 openssl_2.0.5
#> [76] purrr_0.3.5 assertthat_0.2.1 cachem_1.0.6 xfun_0.35 mime_0.12
#> [81] xtable_1.8-4 later_1.3.0 vcr_1.2.0 tibble_3.1.8 memoise_2.0.1
#> [86] ellipsis_0.3.2