Compare commits

...

32 commits
2.2.1...main

SHA1        Date                        Checks     Message
041590a559  2024-10-22 16:15:19 +02:00  passed     Release 3.2.0
d6f883f890  2024-10-22 16:14:46 +02:00  passed     Bump API version
b07420e0bd  2024-08-06 12:23:26 +00:00  passed     Merge pull request 'Bump forgejo-release to v2' (#26) from actions-update into main (Reviewed-on: https://forgejo.neshweb.net///Neshura/aob-lemmy-bot/pulls/26)
0fc71f0a7d  2024-08-06 12:19:37 +00:00  passed     Bump forgejo-release to v2
2ae6468ad8  2024-07-15 22:15:49 +02:00  passed     Release 3.1.0
2ecfe88cb9  2024-07-15 22:15:31 +02:00  -          Update to 0.19.5 and include optional thumbnail
7dcc7bfee2  2024-05-08 16:34:48 +02:00  passed     Release 3.0.3
94d8a4e673  2024-05-08 16:34:32 +02:00  passed     Add Timeout to Status Ping HTTP Request
1b585eab7e  2024-05-07 23:50:14 +02:00  passed     Release 3.0.2
b6f5c38e4a  2024-05-07 23:49:55 +02:00  passed     Overhaul Error handling (Option instead of Result<T, ()> + Logging changes
5d708bdb82  2024-05-07 22:50:22 +02:00  passed     Release 3.0.1
6a8c1662f0  2024-05-07 22:50:12 +02:00  passed     Fix enum problems in config
e02cd900ed  2024-05-07 22:35:15 +02:00  passed     Release 3.0.0
32ea83a7bb  2024-05-07 22:30:02 +02:00  passed     Release Candidate 3.0.0-rc.2
4297860b9e  2024-05-07 22:29:48 +02:00  passed     Legacy fixes for async traits
affe62b973  2024-05-07 22:11:16 +02:00  failed     Release Candidate 3.0.0-rc.1
6520cc65a3  2024-05-07 22:10:21 +02:00  failed     Rewrite Version 3
aefceda628  2024-05-06 22:57:55 +02:00  passed     Release 2.2.8
ee5a159431  2024-05-06 22:57:41 +02:00  passed     Yet another another logging fix
85f8b97607  2024-05-06 22:57:05 +02:00  passed     Yet another logging fix
966dd8f359  2024-05-06 22:54:15 +02:00  passed     Release 2.2.7
17e161bc27  2024-05-06 22:54:01 +02:00  -          Small fixes to logging
3928367692  2024-05-06 22:36:33 +02:00  passed     Release 2.2.6
070eae961a  2024-05-06 22:36:21 +02:00  cancelled  Yet more logging and delay between series handling requests (should avoid potential rate limit)
92103e28ba  2024-05-06 22:21:14 +02:00  passed     Release 2.2.5
cd78d3c1c7  2024-05-06 22:21:04 +02:00  cancelled  Reduced connection timeouts and added some logging
22bbaaa002  2024-05-06 21:53:14 +02:00  passed     Release 2.2.4
9ee9db5792  2024-05-06 21:53:02 +02:00  cancelled  Dependency bumps
23ac0de189  2024-05-06 21:28:00 +02:00  passed     Release 2.2.3
2dc695577e  2024-05-06 21:27:47 +02:00  -          Further Bugfixing
534a8022a9  2024-05-06 21:17:08 +02:00  passed     Release 2.2.2
45bfca8cc5  2024-05-06 21:16:53 +02:00  passed     Bugfix due to bad deduplication

Checks: passed = all CI checks successful; failed = some checks failed; cancelled = run cancelled; - = no checks recorded.
9 changed files with 1337 additions and 677 deletions


@@ -137,7 +137,7 @@ jobs:
         run: rm release_blobs/build.env
       -
         name: Release New Version
-        uses: actions/forgejo-release@v1
+        uses: actions/forgejo-release@v2
         with:
           direction: upload
           url: https://forgejo.neshweb.net

Cargo.lock (generated, 721 changed lines): file diff suppressed because it is too large.


@@ -1,7 +1,7 @@
 [package]
 authors = ["Neshura"]
 name = "aob-lemmy-bot"
-version = "2.2.1"
+version = "3.2.0"
 edition = "2021"
 description = "Bot for automatically posting new chapters of 'Ascendance of a Bookworm' released by J-Novel Club"
 license = "GPL-3.0-or-later"
@@ -16,19 +16,20 @@ systemd-units = { enable = false }
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-chrono = "^0.4.26"
-lemmy_api_common = "0.19.3"
-lemmy_db_schema = "0.19.3"
-once_cell = "^1.18.0"
-reqwest = { version = "^0.11.18", features = ["blocking", "json"] }
-serde = "^1.0.164"
-serde_derive = "^1.0.164"
-serde_json = "^1.0.97"
-strum_macros = "^0.25.0"
-tokio = { version = "^1.32.0", features = ["rt", "rt-multi-thread", "macros"] }
-url = "^2.4.0"
-confy = "^0.5.1"
-toml = "^0.8.8"
+chrono = "^0.4"
+lemmy_api_common = "0.19.5"
+lemmy_db_schema = "0.19.5"
+once_cell = "^1.19"
+reqwest = { version = "^0.12", features = ["blocking", "json"] }
+serde = "^1.0"
+serde_derive = "^1.0"
+serde_json = "^1.0"
+strum_macros = "^0.26"
+tokio = { version = "^1.37", features = ["rt", "rt-multi-thread", "macros"] }
+url = "^2.5"
+confy = "^0.6"
+toml = "^0.8"
 systemd-journal-logger = "^2.1.1"
-log = "^0.4.20"
-async-trait = "^0.1.77"
+log = "^0.4"
+async-trait = "^0.1"
+notify = "6.1.1"
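The loosened version pins above rely on Cargo's caret semantics: a requirement like `^0.4` accepts any semver-compatible `0.4.x`, so the new `^0.4` covers the old `^0.4.26` range and additionally permits older 0.4.x patch releases. A minimal illustration (the comments are explanatory, not part of the manifest):

```toml
[dependencies]
# "^0.4" pins the leftmost non-zero component: accepts 0.4.26, rejects 0.5.0
chrono = "^0.4"
# "^1.37" accepts any 1.x release at or above 1.37.0, rejects 2.0.0
tokio = { version = "^1.37", features = ["rt", "rt-multi-thread", "macros"] }
```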


@@ -1,16 +1,21 @@
use crate::{config::{Config, PostBody, SeriesConfig}, fetchers::{jnovel}, lemmy};
use crate::fetchers::jnovel::JPostInfo;
use crate::lemmy::{Lemmy, PostInfo};
use crate::post_history::SeriesHistory;
use chrono::{DateTime, Duration, Timelike, Utc};
use lemmy_api_common::post::CreatePost;
use lemmy_db_schema::newtypes::{CommunityId, LanguageId};
use lemmy_db_schema::PostFeatureType;
use std::collections::HashMap;
use crate::{config::{Config}, HTTP_CLIENT};
use crate::lemmy::{Lemmy};
use crate::post_history::{SeriesHistory};
use chrono::{DateTime, Duration, Utc};
use std::sync::{Arc, RwLock};
use notify::{Event, EventKind, event::{AccessKind, AccessMode}, RecursiveMode, Watcher};
use tokio::time::sleep;
use crate::fetchers::Fetcher;
use systemd_journal_logger::connected_to_journal;
macro_rules! debug {
($msg:tt) => {
match connected_to_journal() {
true => log::debug!("[DEBUG] {}", $msg),
false => println!("[DEBUG] {}", $msg),
}
};
}
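The logging macros above pick a sink at call time: the journal when one is attached, stdout otherwise. A std-only sketch of the same dispatch pattern, with a stand-in `connected_to_journal` (the real function comes from the systemd-journal-logger crate):

```rust
// Stand-in for systemd_journal_logger::connected_to_journal();
// hard-coded to false so this sketch stays std-only.
fn connected_to_journal() -> bool {
    false
}

// Same shape as the diff's debug!/info!/error! macros: route the
// message based on whether a journal is attached.
macro_rules! info {
    ($msg:tt) => {
        match connected_to_journal() {
            true => eprintln!("[JOURNAL INFO] {}", $msg),
            false => println!("[INFO] {}", $msg),
        }
    };
}
```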
macro_rules! info {
($msg:tt) => {
match connected_to_journal() {
@@ -20,15 +25,6 @@ macro_rules! info {
};
}
macro_rules! warn {
($msg:tt) => {
match connected_to_journal() {
true => log::warn!("[WARN] {}", $msg),
false => println!("[WARN] {}", $msg),
}
};
}
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
@@ -38,226 +34,103 @@ macro_rules! error {
};
}
pub(crate) async fn run() {
let mut last_reload: DateTime<Utc>;
let mut lemmy: Lemmy;
let mut login_error: bool;
let mut communities: HashMap<String, CommunityId>;
let mut post_history: SeriesHistory;
let mut start: DateTime<Utc>;
let mut config: Config = Config::load();
last_reload = Utc::now();
pub(crate) struct Bot {
shared_config: Arc<RwLock<Config>>,
history: SeriesHistory,
run_start_time: DateTime<Utc>
}
lemmy = match lemmy::login(&config).await {
Ok(data) => data,
Err(_) => panic!(),
};
login_error = false;
enum Wait {
Absolute,
Buffer
}
communities = match lemmy.get_communities().await {
Ok(data) => data,
Err(_) => panic!(),
};
impl Bot {
pub(crate) fn new() -> Self {
let config = Config::load();
let shared_config: Arc<RwLock<Config>> = Arc::new(RwLock::new(config));
start = Utc::now();
let shared_config_copy = shared_config.clone();
let mut watcher = notify::recommended_watcher(move |res: Result<Event, notify::Error>| {
match res {
Ok(event) => {
if event.kind == EventKind::Access(AccessKind::Close(AccessMode::Write)) {
let mut write = shared_config_copy.write().expect("Write Lock Failed");
let new_config = Config::load();
write.series = new_config.series;
write.instance = new_config.instance;
write.protected_communities = new_config.protected_communities;
write.status_post_url = new_config.status_post_url;
info!("Reloaded Configuration");
}
},
Err(e) => {
let msg = format!("Error watching files: {e}");
error!(msg);
}
}
}).expect("Watcher Error");
let info_msg = "Bot init successful, starting normal operations".to_owned();
info!(info_msg);
watcher.watch(&Config::get_path(), RecursiveMode::NonRecursive).expect("Error in watcher");
loop {
idle(&start, &config).await;
start = Utc::now();
let history: SeriesHistory = SeriesHistory::load_history();
// replace with watcher
if start - last_reload >= Duration::seconds(config.config_reload_seconds as i64) {
config = Config::load();
let message = "Config reloaded".to_owned();
info!(message);
}
if login_error {
let info_msg = "Login invalid, refreshing session";
info!(info_msg);
lemmy = match lemmy::login(&config).await {
Bot { shared_config, history, run_start_time: Utc::now() }
}
pub(crate) async fn run(&mut self) {
loop {
let mut lemmy = match Lemmy::new(&self.shared_config).await {
Ok(data) => data,
Err(_) => continue,
};
login_error = false;
}
if start - last_reload >= Duration::seconds(config.config_reload_seconds as i64) {
communities = match lemmy.get_communities().await {
Ok(data) => data,
Err(_) => {
login_error = true;
continue;
lemmy.get_communities().await;
self.history = SeriesHistory::load_history();
let start: DateTime<Utc> = Utc::now();
while Utc::now() - start <= Duration::minutes(60) {
self.run_start_time = Utc::now();
self.ping_status().await;
let read_copy = self.shared_config.read().expect("Read Lock Failed").clone();
for series in read_copy.series {
series.update(&mut self.history, &lemmy, &self.shared_config).await;
debug!("Done Updating Series");
self.wait(1, Wait::Absolute).await;
}
};
let message = "Communities reloaded".to_owned();
info!(message);
last_reload = Utc::now();
}
post_history = SeriesHistory::load_history();
let series = config.series.clone();
for series in series {
if handle_series(&series, &communities, &lemmy, &config, &mut post_history)
.await
.is_err()
{
login_error = true;
continue;
};
}
idle(&start, &config).await;
}
}
async fn idle(start: &DateTime<Utc>, config: &Config) {
let mut sleep_duration = Duration::seconds(30);
if Utc::now() - start > sleep_duration {
sleep_duration = Duration::seconds(60);
}
if let Some(status_url) = config.status_post_url.clone() {
match reqwest::get(status_url).await {
Ok(_) => {}
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
debug!("Awaiting Timeout");
self.wait(30, Wait::Buffer).await;
debug!("Pinging Server");
self.ping_status().await;
debug!("Awaiting Timeout 2");
self.wait(30, Wait::Absolute).await;
}
lemmy.logout().await;
}
};
while Utc::now() - start < sleep_duration {
sleep(Duration::milliseconds(100).to_std().unwrap()).await;
}
}
async fn handle_series(series: &SeriesConfig, communities: &HashMap<String, CommunityId>, lemmy: &Lemmy, config: &Config, post_history: &mut SeriesHistory ) -> Result<(), ()> {
let jnc = jnovel::JFetcherOptions::new(series.slug.clone(), series.parted);
let post_list = match jnc.check_feed().await {
Ok(data) => data,
Err(_) => return Err(()),
};
if post_list.is_empty() && Utc::now().minute() % 10 == 0 {
let info_msg = "No Updates found";
info!(info_msg);
}
for post_info in post_list.clone().iter() {
let post_part_info = post_info.get_part_info();
let post_lemmy_info = post_info.get_info();
if post_history.check_for_post(
series.slug.as_str(),
post_part_info.as_string().as_str(),
post_lemmy_info.title.as_str(),
) {
continue;
}
let post_series_config = match post_info {
JPostInfo::Chapter { .. } => &series.prepub_community,
JPostInfo::Volume { .. } => &series.volume_community,
};
let community_id = *communities
.get(post_series_config.name.as_str())
.expect("Given community is invalid");
let post_body = match &post_series_config.post_body {
PostBody::None => None,
PostBody::Description => post_info.get_description(),
PostBody::Custom(text) => Some(text.clone()),
};
let post_data = CreatePost {
name: post_lemmy_info.title.clone(),
community_id,
url: Some(post_lemmy_info.url),
body: post_body,
honeypot: None,
nsfw: None,
language_id: Some(LanguageId(37)), // TODO get this id once every few hours per API request, the ordering of IDs suggests that the EN Id might change in the future
};
let info = format!(
"Posting '{}' to {}",
post_lemmy_info.title.as_str(),
post_series_config.name.as_str()
);
info!(info);
let post_id = lemmy.post(post_data).await?;
if post_series_config.pin_settings.pin_new_post_community
&& config
.protected_communities
.contains(&post_series_config.name)
{
let info = format!(
"Pinning '{}' to {}",
post_lemmy_info.title,
post_series_config.name.as_str()
);
info!(info);
let pinned_posts = lemmy.get_community_pinned(community_id).await?;
if !pinned_posts.is_empty() {
let community_pinned_post = &pinned_posts[0];
lemmy
.unpin(community_pinned_post.post.id, PostFeatureType::Community)
.await?;
}
lemmy.pin(post_id, PostFeatureType::Community).await?;
} else if config
.protected_communities
.contains(&post_series_config.name)
{
let message = format!(
"Community '{}' for Series '{}' is protected. Is this intended?",
&post_series_config.name, series.slug
);
warn!(message);
}
if post_series_config.pin_settings.pin_new_post_local {
let info = format!("Pinning '{}' to Instance", post_lemmy_info.title);
info!(info);
let pinned_posts = lemmy.get_local_pinned().await?;
if !pinned_posts.is_empty() {
for pinned_post in pinned_posts {
if config
.protected_communities
.contains(&pinned_post.community.name)
{
continue;
} else {
let community_pinned_post = &pinned_post;
lemmy
.unpin(community_pinned_post.post.id, PostFeatureType::Local)
.await?;
break;
}
async fn ping_status(&self) {
let read_config = &self.shared_config.read().expect("Read Lock Failed").clone();
if let Some(status_url) = &read_config.status_post_url {
match HTTP_CLIENT.get(status_url).send().await {
Ok(_) => {},
Err(e) => {
let err_msg = format!("While pinging status URL: {e}");
error!(err_msg);
}
}
lemmy.pin(post_id, PostFeatureType::Local).await?;
}
let mut series_history = post_history.get_series(series.slug.as_str());
let mut part_history = series_history.get_part(post_part_info.as_string().as_str());
match post_info {
JPostInfo::Chapter { .. } => part_history.chapter = post_info.get_info().title,
JPostInfo::Volume { .. } => part_history.volume = post_info.get_info().title,
}
series_history.set_part(post_part_info.as_string().as_str(), part_history);
post_history
.set_series(series.slug.as_str(), series_history);
post_history.save_history();
}
Ok(())
async fn wait(&self, seconds: i64, start_time: Wait) {
let duration: Duration = Duration::seconds(seconds);
let start_time: DateTime<Utc> = match start_time {
Wait::Absolute => Utc::now(),
Wait::Buffer => self.run_start_time,
};
while Utc::now() - start_time < duration {
sleep(Duration::milliseconds(100).to_std().unwrap()).await
}
}
}
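The `Wait` enum introduced above distinguishes two delay baselines: `Absolute` counts the full duration from the moment `wait` is called, while `Buffer` counts from `run_start_time`, so time already spent handling series is credited against the delay. A std-only sketch of that distinction (the names `deadline` and `run_start` are illustrative, not from the diff):

```rust
use std::time::{Duration, Instant};

// Two delay baselines, mirroring the diff's Wait enum:
// Absolute: full delay starting from "now";
// Buffer: delay measured from an earlier start instant, so elapsed
// work time counts toward it.
enum Wait {
    Absolute,
    Buffer,
}

// Illustrative helper (not in the diff): compute when the wait ends.
fn deadline(mode: Wait, run_start: Instant, secs: u64) -> Instant {
    let base = match mode {
        Wait::Absolute => Instant::now(),
        Wait::Buffer => run_start,
    };
    base + Duration::from_secs(secs)
}
```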


@@ -1,12 +1,57 @@
use std::path::PathBuf;
use std::sync::{Arc, RwLock};
use chrono::{Timelike, Utc};
use crate::config::PostBody::Description;
use lemmy_api_common::sensitive::Sensitive;
use lemmy_db_schema::PostFeatureType;
use lemmy_db_schema::sensitive::SensitiveString;
use serde_derive::{Deserialize, Serialize};
use crate::lemmy::{Lemmy, PartInfo, PostType};
use crate::post_history::{SeriesHistory};
use systemd_journal_logger::connected_to_journal;
use crate::fetchers::{FetcherTrait, Fetcher};
use crate::fetchers::jnovel::{JNovelFetcher};
macro_rules! debug {
($msg:tt) => {
match connected_to_journal() {
true => log::debug!("[DEBUG] {}", $msg),
false => println!("[DEBUG] {}", $msg),
}
};
}
macro_rules! info {
($msg:tt) => {
match connected_to_journal() {
true => log::info!("[INFO] {}", $msg),
false => println!("[INFO] {}", $msg),
}
};
}
macro_rules! warn {
($msg:tt) => {
match connected_to_journal() {
true => log::warn!("[WARN] {}", $msg),
false => println!("[WARN] {}", $msg),
}
};
}
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
true => log::error!("[ERROR] {}", $msg),
false => eprintln!("[ERROR] {}", $msg),
}
};
}
#[derive(Serialize, Deserialize, Clone, Debug)]
pub(crate) struct Config {
pub(crate) instance: String,
username: String,
password: String,
username: SensitiveString,
password: SensitiveString,
pub(crate) status_post_url: Option<String>,
pub(crate) config_reload_seconds: u32,
pub(crate) protected_communities: Vec<String>,
@@ -40,12 +85,16 @@ impl Config {
cfg
}
pub(crate) fn get_username(&self) -> Sensitive<String> {
Sensitive::new(self.username.clone())
pub(crate) fn get_path() -> PathBuf {
confy::get_configuration_file_path(env!("CARGO_PKG_NAME"), "config").expect("Application will not work without confy")
}
pub(crate) fn get_password(&self) -> Sensitive<String> {
Sensitive::new(self.password.clone())
pub(crate) fn get_username(&self) -> SensitiveString {
self.username.clone()
}
pub(crate) fn get_password(&self) -> SensitiveString {
self.password.clone()
}
}
@@ -53,8 +102,8 @@ impl Default for Config {
fn default() -> Self {
Config {
instance: "".to_owned(),
username: "".to_owned(),
password: "".to_owned(),
username: SensitiveString::from("".to_owned()),
password: SensitiveString::from("".to_owned()),
status_post_url: None,
config_reload_seconds: 21600,
protected_communities: vec![],
@@ -69,6 +118,162 @@ pub(crate) struct SeriesConfig {
pub(crate) parted: bool,
pub(crate) prepub_community: PostConfig,
pub(crate) volume_community: PostConfig,
pub(crate) fetcher: Fetcher
}
impl SeriesConfig {
pub(crate) async fn update(&self, history: &mut SeriesHistory, lemmy: &Lemmy, config: &Arc<RwLock<Config>>) {
let info_msg = format!("Checking {} for Updates", self.slug);
info!(info_msg);
let mut fetcher: Fetcher = match &self.fetcher {
Fetcher::Jnc(_) => {
Fetcher::Jnc(JNovelFetcher::new())
},
/*default => {
let err_msg = format!("Fetcher {default} not implemented");
error!(err_msg);
return;
}*/
};
match fetcher {
Fetcher::Jnc(ref mut jnc) => {
jnc.set_series(self.slug.clone());
jnc.set_part_option(self.parted);
}
}
let post_list = match fetcher.check_feed().await {
Ok(data) => data,
Err(_) => {
let err_msg = format!("While checking feed for {}", self.slug);
error!(err_msg);
return;
}
};
if post_list.is_empty() && Utc::now().minute() % 10 == 0 {
let info_msg = "No Updates found";
info!(info_msg);
}
for post_info in post_list.iter() {
if history.check_for_post(
self.slug.as_str(),
post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str(),
post_info.get_info().title.as_str()
) {
continue
}
let post_data = post_info.get_post_data(self, lemmy);
let info = format!(
"Posting '{}' to {}",
post_info.get_info().title.as_str(),
post_info.get_post_config(self).name.as_str()
);
info!(info);
let post_id = match lemmy.post(post_data).await {
Some(data) => data,
None=> {
error!("Error posting chapter");
return;
}
};
let read_config = config.read().expect("Read Lock Failed").clone();
if post_info.get_post_config(self).pin_settings.pin_new_post_community
&& !read_config
.protected_communities
.contains(&post_info.get_post_config(self).name)
{
let info = format!(
"Pinning '{}' to {}",
post_info.get_info().title,
post_info.get_post_config(self).name.as_str()
);
info!(info);
let pinned_posts = lemmy.get_community_pinned(lemmy.get_community_id(&post_info.get_post_config(self).name)).await.unwrap_or_else(|| {
error!("Pinning of Post to community failed");
vec![]
});
if !pinned_posts.is_empty() {
let community_pinned_post = &pinned_posts[0];
if lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Community).await.is_none() {
error!("Error un-pinning post");
}
}
if lemmy.pin(post_id, PostFeatureType::Community).await.is_none() {
error!("Error pinning post");
}
} else if read_config
.protected_communities
.contains(&post_info.get_post_config(self).name)
{
let message = format!(
"Community '{}' for Series '{}' is protected. Is this intended?",
&post_info.get_post_config(self).name, self.slug
);
warn!(message);
}
if post_info.get_post_config(self).pin_settings.pin_new_post_local {
let info = format!("Pinning '{}' to Instance", post_info.get_info().title);
info!(info);
let pinned_posts = match lemmy.get_local_pinned().await {
Some(data) => {data}
None => {
error!("Error fetching pinned posts");
vec![]
}
};
if !pinned_posts.is_empty() {
for pinned_post in pinned_posts {
if read_config
.protected_communities
.contains(&pinned_post.community.name)
{
continue;
} else {
let community_pinned_post = &pinned_post;
if lemmy.unpin(community_pinned_post.post.id, PostFeatureType::Local).await.is_none() {
error!("Error pinning post");
continue;
}
break;
}
}
}
if lemmy.pin(post_id, PostFeatureType::Local).await.is_none() {
error!("Error pinning post");
};
}
let mut series_history = history.get_series(self.slug.as_str());
let mut part_history = series_history.get_part(post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str());
match post_info.post_type {
Some(post_type) => {
match post_type {
PostType::Chapter => part_history.chapter = post_info.get_info().title,
PostType::Volume => part_history.volume = post_info.get_info().title,
}
}
None => part_history.chapter = post_info.get_info().title,
}
series_history.set_part(post_info.get_part_info().unwrap_or(PartInfo::NoParts).as_string().as_str(), part_history);
history
.set_series(self.slug.as_str(), series_history);
debug!("Saving History");
history.save_history();
}
}
}
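The pinning branch in `SeriesConfig::update` above only features a post in its community when pinning is enabled and the community is not listed in `protected_communities`. Reduced to a std-only predicate (the function name is illustrative, not from the diff):

```rust
// Illustrative reduction of the pin decision in SeriesConfig::update:
// pin only when the setting is on AND the community is not protected.
fn should_pin_to_community(pin_enabled: bool, protected: &[String], community: &str) -> bool {
    pin_enabled && !protected.iter().any(|name| name.as_str() == community)
}
```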
#[derive(Debug, Serialize, Deserialize, Clone)]


@@ -1,16 +1,13 @@
use crate::{HTTP_CLIENT, lemmy};
use crate::{HTTP_CLIENT};
use chrono::{DateTime, Duration, Utc};
use serde_derive::{Deserialize, Serialize};
use std::cmp::Ordering;
use std::collections::HashMap;
use std::ops::Sub;
use async_trait::async_trait;
use url::Url;
use crate::fetchers::Fetcher;
use crate::fetchers::jnovel::JPostInfo::{Chapter, Volume};
use crate::fetchers::jnovel::PartInfo::{NoParts, Part};
use crate::lemmy::{PostInfo, PostInfoInner};
use crate::fetchers::{FetcherTrait};
use crate::lemmy::{PartInfo, PostInfo, PostInfoInner, PostType};
use systemd_journal_logger::connected_to_journal;
use crate::lemmy::PartInfo::{NoParts, Part};
macro_rules! error {
($msg:tt) => {
@@ -34,7 +31,7 @@ static PAST_DAYS_ELIGIBLE: u8 = 4;
macro_rules! api_url {
() => {
"https://labs.j-novel.club/app/v1".to_owned()
"https://labs.j-novel.club/app/v2".to_owned()
};
}
@@ -91,191 +88,41 @@ pub(crate) struct ChapterDetail {
pub(crate) cover: Option<Cover>,
}
#[derive(Debug, Copy, Clone)]
pub(crate) enum PartInfo {
NoParts,
Part(u8),
}
impl PartInfo {
pub(crate) fn as_u8(&self) -> u8 {
match self {
Part(number) => *number,
NoParts => 0,
}
}
pub(crate) fn as_string(&self) -> String {
self.as_u8().to_string()
}
}
impl PartialEq for PartInfo {
fn eq(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric == other_numeric
}
}
impl PartialOrd for PartInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric < other_numeric
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric > other_numeric
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
#[derive(Debug, Clone)]
pub(crate) enum JPostInfo {
Chapter {
part: PartInfo,
lemmy_info: PostInfoInner,
},
Volume {
part: PartInfo,
description: String,
lemmy_info: PostInfoInner,
},
}
impl JPostInfo {
pub(crate) fn get_part_info(&self) -> PartInfo {
match self {
Chapter {
part: part_info, ..
} => *part_info,
Volume {
part: part_info, ..
} => *part_info,
}
}
}
impl PostInfo for JPostInfo {
fn get_info(&self) -> PostInfoInner {
match self {
Chapter { lemmy_info, .. } => lemmy_info.clone(),
Volume { lemmy_info, .. } => lemmy_info.clone(),
}
}
fn get_description(&self) -> Option<String> {
match self {
Chapter { .. } => None,
Volume { description, .. } => Some(description.clone()),
}
}
}
impl PartialEq for JPostInfo {
fn eq(&self, other: &Self) -> bool {
let self_part = match self {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
let other_part = match other {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
self_part.eq(other_part)
}
}
impl PartialOrd for JPostInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_part = match self {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
let other_part = match other {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
self_part < other_part
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_part = match self {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
let other_part = match other {
Chapter { part, .. } => part,
Volume { part, .. } => part,
};
self_part > other_part
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
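The removed hand-written comparison impls above all reduce to comparing the numeric part, with `NoParts` acting as part 0. A std-only sketch of that ordering (the type is trimmed to what the comparison needs):

```rust
use std::cmp::Ordering;

// Trimmed version of the removed PartInfo: ordering and equality are
// defined entirely by the numeric part, with NoParts treated as 0.
#[derive(Debug, Copy, Clone)]
enum PartInfo {
    NoParts,
    Part(u8),
}

impl PartInfo {
    fn as_u8(&self) -> u8 {
        match self {
            PartInfo::Part(n) => *n,
            PartInfo::NoParts => 0,
        }
    }
}

impl PartialEq for PartInfo {
    fn eq(&self, other: &Self) -> bool {
        self.as_u8() == other.as_u8()
    }
}

impl PartialOrd for PartInfo {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.as_u8().cmp(&other.as_u8()))
    }
}
```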
-pub(crate) struct JFetcherOptions {
+#[derive(Deserialize, Serialize, Debug, Clone)]
+pub(crate) struct JNovelFetcher {
series_slug: String,
series_has_parts: bool
}
-impl JFetcherOptions {
-pub(crate) fn new(series_slug: String, series_has_parts: bool) -> Self {
-JFetcherOptions {
-series_slug,
-series_has_parts
+impl Default for JNovelFetcher {
+fn default() -> Self {
+Self {
+series_slug: "".to_owned(),
+series_has_parts: false,
}
}
}
+impl JNovelFetcher {
+pub(crate) fn set_series(&mut self, series: String) {
+self.series_slug = series;
+}
+pub(crate) fn set_part_option(&mut self, has_parts: bool) {
+self.series_has_parts = has_parts;
+}
+}
#[async_trait]
-impl Fetcher for JFetcherOptions {
-type Return = JPostInfo;
-async fn check_feed(&self) -> Result<Vec<Self::Return>, ()> {
+impl FetcherTrait for JNovelFetcher {
+fn new() -> Self {
+JNovelFetcher {
+series_slug: "".to_owned(),
+series_has_parts: false
+}
+}
+async fn check_feed(&self) -> Result<Vec<PostInfo>, ()> {
let response = match HTTP_CLIENT
.get(api_url!() + "/series/" + self.series_slug.as_str() + "/volumes?format=json")
.send()
@@ -284,7 +131,7 @@ impl Fetcher for JFetcherOptions {
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
-let err_msg = format!("{e}");
+let err_msg = format!("While checking feed: {e}");
error!(err_msg);
return Err(());
}
@@ -307,8 +154,8 @@ impl Fetcher for JFetcherOptions {
volume_brief_data.volumes.reverse(); // Makes breaking out of the volume loop easier
// If no parts just use 0 as Part indicator as no Series with Parts has a Part 0
-let mut volume_map: HashMap<u8, JPostInfo> = HashMap::new();
-let mut prepub_map: HashMap<u8, JPostInfo> = HashMap::new();
+let mut volume_map: HashMap<u8, PostInfo> = HashMap::new();
+let mut prepub_map: HashMap<u8, PostInfo> = HashMap::new();
for volume in volume_brief_data.volumes.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&volume.publishing).unwrap();
@@ -351,14 +198,16 @@ impl Fetcher for JFetcherOptions {
self.series_slug.as_str(),
volume.number
);
-let post_details = lemmy::PostInfoInner {
+let post_details = PostInfoInner {
title: volume.title.clone(),
-url: Url::parse(&post_url).unwrap(),
+url: post_url.clone(),
+thumbnail: Some(volume.cover.thumbnail.clone())
};
-let new_post_info = Volume {
-part: new_part_info,
-description: volume.short_description.clone(),
+let new_post_info = PostInfo {
+post_type: Some(PostType::Volume),
+part: Some(new_part_info),
+description: Some(volume.short_description.clone()),
lemmy_info: post_details,
};
@@ -375,10 +224,12 @@ impl Fetcher for JFetcherOptions {
.or_insert(new_post_info);
}
-if let Some(prepub_info) = get_latest_prepub(&volume.slug).await? {
-let prepub_post_info = Chapter {
-part: new_part_info,
+if let Some(prepub_info) = get_latest_prepub(&volume.slug).await {
+let prepub_post_info = PostInfo {
+post_type: Some(PostType::Chapter),
+part: Some(new_part_info),
lemmy_info: prepub_info,
+description: None,
};
prepub_map
@@ -392,8 +243,8 @@ impl Fetcher for JFetcherOptions {
}
}
-let mut result_vec: Vec<JPostInfo> = volume_map.values().cloned().collect();
-let mut prepub_vec: Vec<JPostInfo> = prepub_map.values().cloned().collect();
+let mut result_vec: Vec<PostInfo> = volume_map.values().cloned().collect();
+let mut prepub_vec: Vec<PostInfo> = prepub_map.values().cloned().collect();
result_vec.append(&mut prepub_vec);
Ok(result_vec)
@@ -401,7 +252,7 @@ impl Fetcher for JFetcherOptions {
}
-async fn get_latest_prepub(volume_slug: &str) -> Result<Option<lemmy::PostInfoInner>, ()> {
+async fn get_latest_prepub(volume_slug: &str) -> Option<PostInfoInner> {
let response = match HTTP_CLIENT
.get(api_url!() + "/volumes/" + volume_slug + "/parts?format=json")
.send()
@@ -410,15 +261,15 @@ async fn get_latest_prepub(volume_slug: &str) -> Result<Option<lemmy::PostInfoIn
Ok(data) => match data.text().await {
Ok(data) => data,
Err(e) => {
-let err_msg = format!("{e}");
+let err_msg = format!("While getting latest PrePub: {e}");
error!(err_msg);
-return Err(());
+return None;
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
-return Err(());
+return None;
}
};
@@ -427,12 +278,12 @@ async fn get_latest_prepub(volume_slug: &str) -> Result<Option<lemmy::PostInfoIn
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
-return Err(());
+return None;
}
};
volume_prepub_parts_data.parts.reverse(); // Makes breaking out of the parts loop easier
-let mut post_details: Option<lemmy::PostInfoInner> = None;
+let mut post_details: Option<PostInfoInner> = None;
for prepub_part in volume_prepub_parts_data.parts.iter() {
let publishing_date = DateTime::parse_from_rfc3339(&prepub_part.launch).unwrap();
@@ -442,12 +293,15 @@ async fn get_latest_prepub(volume_slug: &str) -> Result<Option<lemmy::PostInfoIn
continue;
}
+let thumbnail = prepub_part.cover.as_ref().map(|cover| cover.thumbnail.clone());
let post_url = format!("{}/read/{}", jnc_base_url!(), prepub_part.slug);
-post_details = Some(lemmy::PostInfoInner {
+post_details = Some(PostInfoInner {
title: prepub_part.title.clone(),
-url: Url::parse(&post_url).unwrap(),
+url: post_url.clone(),
+thumbnail
});
}
-Ok(post_details)
+post_details
}


@@ -1,9 +1,33 @@
use async_trait::async_trait;
+use serde_derive::{Deserialize, Serialize};
+use strum_macros::Display;
+use crate::fetchers::Fetcher::Jnc;
+use crate::fetchers::jnovel::JNovelFetcher;
+use crate::lemmy::{PostInfo};
pub mod jnovel;
#[async_trait]
-pub(crate) trait Fetcher {
-type Return;
-async fn check_feed(&self) -> Result<Vec<Self::Return>, ()>;
+pub(crate) trait FetcherTrait {
+fn new() -> Self where Self: Sized;
+async fn check_feed(&self) -> Result<Vec<PostInfo>, ()>;
}
+impl Fetcher {
+pub(crate) async fn check_feed(&self) -> Result<Vec<PostInfo>, ()> {
+match self {
+Jnc(fetcher) => fetcher.check_feed().await,
+/*default => {
+let err_msg = format!("Fetcher {default} is not implemented");
+error!(err_msg);
+Err(())
+}*/
+}
+}
+}
+#[derive(Deserialize, Serialize, Debug, Clone, Display)]
+pub(crate) enum Fetcher {
+#[serde(rename = "jnc")]
+Jnc(#[serde(skip)] JNovelFetcher)
+}
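The `Fetcher` enum above forwards `check_feed` to the concrete fetcher wrapped in each variant. A minimal synchronous sketch of that enum-dispatch pattern, using illustrative stand-in types rather than the crate's real ones:

```rust
// Hypothetical concrete fetcher; the real one is async and talks to an API.
#[derive(Debug, Clone)]
struct JncFetcher {
    series_slug: String,
}

impl JncFetcher {
    fn check_feed(&self) -> Result<Vec<String>, ()> {
        Ok(vec![format!("post for {}", self.series_slug)])
    }
}

// Each supported source gets one variant wrapping its fetcher.
#[derive(Debug, Clone)]
enum Fetcher {
    Jnc(JncFetcher),
}

impl Fetcher {
    // The enum's method matches on the variant and delegates,
    // giving callers a single uniform entry point.
    fn check_feed(&self) -> Result<Vec<String>, ()> {
        match self {
            Fetcher::Jnc(f) => f.check_feed(),
        }
    }
}
```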


@@ -1,18 +1,28 @@
-use crate::config::Config;
+use std::cmp::Ordering;
+use crate::config::{Config, PostBody, PostConfig, SeriesConfig};
use crate::{HTTP_CLIENT};
use lemmy_api_common::community::{ListCommunities, ListCommunitiesResponse};
use lemmy_api_common::lemmy_db_views::structs::PostView;
use lemmy_api_common::person::{Login, LoginResponse};
use lemmy_api_common::post::{CreatePost, FeaturePost, GetPosts, GetPostsResponse};
-use lemmy_api_common::sensitive::Sensitive;
-use lemmy_db_schema::newtypes::{CommunityId, PostId};
+use lemmy_db_schema::newtypes::{CommunityId, LanguageId, PostId};
use lemmy_db_schema::{ListingType, PostFeatureType};
use reqwest::StatusCode;
use std::collections::HashMap;
+use std::sync::{RwLock};
+use lemmy_db_schema::sensitive::SensitiveString;
use serde::{Deserialize, Serialize};
use url::Url;
use systemd_journal_logger::connected_to_journal;
+macro_rules! debug {
+($msg:tt) => {
+match connected_to_journal() {
+true => log::debug!("[DEBUG] {}", $msg),
+false => println!("[DEBUG] {}", $msg),
+}
+};
+}
macro_rules! error {
($msg:tt) => {
match connected_to_journal() {
@@ -23,84 +33,197 @@ macro_rules! error {
}
pub(crate) struct Lemmy {
-jwt_token: Sensitive<String>,
+jwt_token: SensitiveString,
instance: String,
+communities: HashMap<String, CommunityId>,
}
#[derive(Debug, Clone)]
pub(crate) struct PostInfoInner {
pub(crate) title: String,
-pub(crate) url: Url,
+pub(crate) url: String,
+pub(crate) thumbnail: Option<String>
}
-pub(crate) trait PostInfo {
-fn get_info(&self) -> PostInfoInner;
-fn get_description(&self) -> Option<String>;
+#[derive(Debug, Copy, Clone)]
+pub(crate) enum PartInfo {
+NoParts,
+Part(u8),
}
-pub(crate) async fn login(config: &Config) -> Result<Lemmy, ()> {
-let login_params = Login {
-username_or_email: config.get_username(),
-password: config.get_password(),
-totp_2fa_token: None,
-};
-let response = match HTTP_CLIENT
-.post(config.instance.to_owned() + "/api/v3/user/login")
-.json(&login_params)
-.send()
-.await
-{
-Ok(data) => data,
-Err(e) => {
-let err_msg = format!("{e}");
-error!(err_msg);
-return Err(());
+impl PartInfo {
+pub(crate) fn as_u8(&self) -> u8 {
+match self {
+PartInfo::Part(number) => *number,
+PartInfo::NoParts => 0,
+}
-};
}
-match response.status() {
-StatusCode::OK => {
-let data: LoginResponse = response
-.json()
-.await
-.expect("Successful Login Request should return JSON");
-match data.jwt {
-Some(token) => Ok(Lemmy {
-jwt_token: token.clone(),
-instance: config.instance.to_owned(),
-}),
-None => {
-let err_msg = "Login did not return JWT token. Are the credentials valid?".to_owned();
-error!(err_msg);
-Err(())
+pub(crate) fn as_string(&self) -> String {
+self.as_u8().to_string()
+}
}
impl PartialEq for PartInfo {
fn eq(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric == other_numeric
}
}
impl PartialOrd for PartInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric < other_numeric
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
let self_numeric = self.as_u8();
let other_numeric = other.as_u8();
self_numeric > other_numeric
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
#[derive(Debug, Clone, Copy)]
pub(crate) enum PostType {
Chapter,
Volume
}
#[derive(Debug, Clone)]
pub(crate) struct PostInfo {
pub(crate) part: Option<PartInfo>,
pub(crate) lemmy_info: PostInfoInner,
pub(crate) description: Option<String>,
pub(crate) post_type: Option<PostType>
}
impl PostInfo {
pub(crate) fn get_info(&self) -> PostInfoInner {
self.lemmy_info.clone()
}
pub(crate) fn get_description(&self) -> Option<String> {
self.description.clone()
}
pub(crate) fn get_part_info(&self) -> Option<PartInfo> {
self.part
}
pub(crate) fn get_post_config(&self, series: &SeriesConfig) -> PostConfig {
match self.post_type {
Some(post_type) => {
match post_type {
PostType::Chapter => series.prepub_community.clone(),
PostType::Volume => series.volume_community.clone(),
}
}
None => series.prepub_community.clone(),
}
-status => {
-let err_msg = format!("Unexpected HTTP Status '{}' during Login", status);
-error!(err_msg);
-Err(())
}
pub(crate) fn get_post_data(&self, series: &SeriesConfig, lemmy: &Lemmy) -> CreatePost {
let post_config = self.get_post_config(series);
let post_body = match &post_config.post_body {
PostBody::None => None,
PostBody::Description => self.get_description(),
PostBody::Custom(text) => Some(text.clone()),
};
let community_id: CommunityId = lemmy.get_community_id(&post_config.name);
CreatePost {
name: self.get_info().title.clone(),
community_id,
url: Some(self.get_info().url),
custom_thumbnail: self.get_info().thumbnail,
body: post_body,
alt_text: None,
honeypot: None,
nsfw: None,
language_id: Some(LanguageId(37)), // TODO get this id once every few hours per API request, the ordering of IDs suggests that the EN Id might change in the future
}
}
}
impl PartialEq for PostInfo {
fn eq(&self, other: &Self) -> bool {
self.part.eq(&other.part)
}
}
impl PartialOrd for PostInfo {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
if self.gt(other) {
Some(Ordering::Greater)
} else if self.eq(other) {
Some(Ordering::Equal)
} else {
Some(Ordering::Less)
}
}
fn lt(&self, other: &Self) -> bool {
self.part < other.part
}
fn le(&self, other: &Self) -> bool {
!self.gt(other)
}
fn gt(&self, other: &Self) -> bool {
self.part > other.part
}
fn ge(&self, other: &Self) -> bool {
!self.lt(other)
}
}
impl Lemmy {
-pub(crate) async fn post(&self, post: CreatePost) -> Result<PostId, ()> {
-let response = self.fetch_data_json("/api/v3/post", &post).await?;
-let json_data: PostView = self.parse_json(&response).await?;
-Ok(json_data.post.id)
+pub(crate) fn get_community_id(&self, name: &str) -> CommunityId {
+*self.communities.get(name).expect("Given community is invalid")
}
+pub(crate) async fn new(config: &RwLock<Config>) -> Result<Self, ()> {
+let read_config = config.read().expect("Read Lock Failed").clone();
+let login_params = Login {
+username_or_email: read_config.get_username(),
+password: read_config.get_password(),
+totp_2fa_token: None,
+};
-async fn feature(&self, params: FeaturePost) -> Result<PostView, ()> {
-let response = self.fetch_data_json("/api/v3/post/feature", &params).await?;
-let json_data = match serde_json::from_str::<HashMap<&str, PostView>>(&response) {
-Ok(mut data) => data.remove("post_view").expect("Element should be present"),
+let response = match HTTP_CLIENT
+.post(read_config.instance.to_owned() + "/api/v3/user/login")
+.json(&login_params)
+.send()
+.await
+{
Ok(data) => data,
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
@@ -108,10 +231,65 @@ impl Lemmy {
}
};
-Ok(json_data)
+match response.status() {
+StatusCode::OK => {
+let data: LoginResponse = response
+.json()
+.await
+.expect("Successful Login Request should return JSON");
+match data.jwt {
+Some(token) => Ok(Lemmy {
+jwt_token: token.clone(),
+instance: read_config.instance.to_owned(),
+communities: HashMap::new(),
+}),
+None => {
+let err_msg = "Login did not return JWT token. Are the credentials valid?".to_owned();
+error!(err_msg);
+Err(())
+}
+}
+}
+status => {
+let err_msg = format!("Unexpected HTTP Status '{}' during Login", status);
+error!(err_msg);
+Err(())
+}
+}
}
-pub(crate) async fn unpin(&self, post_id: PostId, location: PostFeatureType) -> Result<PostView, ()> {
+pub(crate) async fn logout(&self) {
+let _ = self.post_data_json("/api/v3/user/logout", &"").await;
+}
+pub(crate) async fn post(&self, post: CreatePost) -> Option<PostId> {
+let response: String = match self.post_data_json("/api/v3/post", &post).await {
+Some(data) => data,
+None => return None,
+};
+let json_data: PostView = match self.parse_json_map(&response).await {
+Some(data) => data,
+None => return None,
+};
+Some(json_data.post.id)
+}
+async fn feature(&self, params: FeaturePost) -> Option<PostView> {
+let response: String = match self.post_data_json("/api/v3/post/feature", &params).await {
+Some(data) => data,
+None => return None,
+};
+let json_data: PostView = match self.parse_json_map(&response).await {
+Some(data) => data,
+None => return None,
+};
+Some(json_data)
+}
+pub(crate) async fn unpin(&self, post_id: PostId, location: PostFeatureType) -> Option<PostView> {
let pin_params = FeaturePost {
post_id,
featured: false,
@@ -120,7 +298,7 @@ impl Lemmy {
self.feature(pin_params).await
}
-pub(crate) async fn pin(&self, post_id: PostId, location: PostFeatureType) -> Result<PostView, ()> {
+pub(crate) async fn pin(&self, post_id: PostId, location: PostFeatureType) -> Option<PostView> {
let pin_params = FeaturePost {
post_id,
featured: true,
@@ -129,18 +307,23 @@ impl Lemmy {
self.feature(pin_params).await
}
-pub(crate) async fn get_community_pinned(&self, community: CommunityId) -> Result<Vec<PostView>, ()> {
+pub(crate) async fn get_community_pinned(&self, community: CommunityId) -> Option<Vec<PostView>> {
let list_params = GetPosts {
community_id: Some(community),
type_: Some(ListingType::Local),
..Default::default()
};
-let response = self.fetch_data_query("/api/v3/post/list", &list_params).await?;
+let response: String = match self.get_data_query("/api/v3/post/list", &list_params).await {
+Some(data) => data,
+None => return None,
+};
+let json_data: GetPostsResponse = match self.parse_json(&response).await {
+Some(data) => data,
+None => return None,
+};
-let json_data: GetPostsResponse = self.parse_json(&response).await?;
-Ok(json_data
+Some(json_data
.posts
.iter()
.filter(|post| post.post.featured_community)
@@ -148,17 +331,22 @@ impl Lemmy {
.collect())
}
-pub(crate) async fn get_local_pinned(&self) -> Result<Vec<PostView>, ()> {
+pub(crate) async fn get_local_pinned(&self) -> Option<Vec<PostView>> {
let list_params = GetPosts {
type_: Some(ListingType::Local),
..Default::default()
};
-let response = self.fetch_data_query("/api/v3/post/list", &list_params).await?;
+let response: String = match self.get_data_query("/api/v3/post/list", &list_params).await {
+Some(data) => data,
+None => return None,
+};
+let json_data: GetPostsResponse = match self.parse_json(&response).await {
+Some(data) => data,
+None => return None,
+};
-let json_data: GetPostsResponse = self.parse_json(&response).await?;
-Ok(json_data
+Some(json_data
.posts
.iter()
.filter(|post| post.post.featured_local)
@@ -166,15 +354,20 @@ impl Lemmy {
.collect())
}
-pub(crate) async fn get_communities(&self) -> Result<HashMap<String, CommunityId>, ()> {
+pub(crate) async fn get_communities(&mut self) {
let list_params = ListCommunities {
type_: Some(ListingType::Local),
..Default::default()
};
-let response = self.fetch_data_query("/api/v3/community/list", &list_params).await?;
-let json_data: ListCommunitiesResponse = self.parse_json(&response).await?;
+let response: String = match self.get_data_query("/api/v3/community/list", &list_params).await {
+Some(data) => data,
+None => return,
+};
+let json_data: ListCommunitiesResponse = match self.parse_json::<ListCommunitiesResponse>(&response).await {
+Some(data) => data,
+None => return,
+};
let mut communities: HashMap<String, CommunityId> = HashMap::new();
for community_view in json_data.communities {
@@ -182,12 +375,12 @@ impl Lemmy {
communities.insert(community.name, community.id);
}
-Ok(communities)
+self.communities = communities;
}
-async fn fetch_data_json<T: Serialize>(&self, route: &str, json: &T ) -> Result<String,()> {
+async fn post_data_json<T: Serialize>(&self, route: &str, json: &T ) -> Option<String> {
let res = HTTP_CLIENT
-.post(format!("{}{route}", self.instance))
+.post(format!("{}{route}", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
.json(&json)
.send()
@@ -195,41 +388,62 @@ impl Lemmy {
self.extract_data(res).await
}
-async fn fetch_data_query<T: Serialize>(&self, route: &str, json: &T ) -> Result<String,()> {
+async fn get_data_query<T: Serialize>(&self, route: &str, param: &T ) -> Option<String> {
let res = HTTP_CLIENT
-.post(format!("{}{route}", self.instance))
+.get(format!("{}{route}", &self.instance))
.bearer_auth(&self.jwt_token.to_string())
-.query(&json)
+.query(&param)
.send()
.await;
self.extract_data(res).await
}
-async fn extract_data(&self, response: Result<reqwest::Response, reqwest::Error>) -> Result<String,()> {
+async fn extract_data(&self, response: Result<reqwest::Response, reqwest::Error>) -> Option<String> {
match response {
-Ok(data) => match data.text().await {
-Ok(data) => Ok(data),
-Err(e) => {
-let err_msg = format!("{e}");
+Ok(data) => {
+if data.status().is_success() {
+match data.text().await {
+Ok(data) => Some(data),
+Err(e) => {
+let err_msg = format!("{e}");
+error!(err_msg);
+None
+}
+}
+}
+else {
+let err_msg = format!("HTTP Request failed: {}", data.text().await.unwrap());
error!(err_msg);
-Err(())
+None
}
},
Err(e) => {
let err_msg = format!("{e}");
error!(err_msg);
-Err(())
+None
}
}
}
-async fn parse_json<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Result<T,()> {
-match serde_json::from_str::<HashMap<&str, T>>(response) {
-Ok(mut data) => Ok(data.remove("post_view").expect("Element should be present")),
+async fn parse_json<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Option<T> {
+match serde_json::from_str::<T>(response) {
+Ok(data) => Some(data),
Err(e) => {
-let err_msg = format!("{e}");
+let err_msg = format!("while parsing JSON: {e} ");
error!(err_msg);
-Err(())
+None
}
}
}
async fn parse_json_map<'a, T: Deserialize<'a>>(&self, response: &'a str) -> Option<T> {
debug!(response);
match serde_json::from_str::<HashMap<&str, T>>(response) {
Ok(mut data) => Some(data.remove("post_view").expect("Element should be present")),
Err(e) => {
let err_msg = format!("while parsing JSON HashMap: {e}");
error!(err_msg);
None
}
}
}


@@ -3,6 +3,7 @@ use log::{LevelFilter};
use once_cell::sync::Lazy;
use reqwest::Client;
use systemd_journal_logger::{JournalLog};
+use crate::bot::Bot;
mod bot;
mod config;
@@ -12,8 +13,8 @@ mod fetchers;
pub static HTTP_CLIENT: Lazy<Client> = Lazy::new(|| {
Client::builder()
-.timeout(Duration::seconds(30).to_std().unwrap())
-.connect_timeout(Duration::seconds(30).to_std().unwrap())
+.timeout(Duration::seconds(10).to_std().unwrap())
+.connect_timeout(Duration::seconds(10).to_std().unwrap())
.build()
.expect("build client")
});
@@ -24,6 +25,17 @@ async fn main() {
.expect("Systemd-Logger crate error")
.install()
.expect("Systemd-Logger crate error");
-log::set_max_level(LevelFilter::Info);
-bot::run().await;
+match std::env::var("LOG_LEVEL") {
+Ok(level) => {
+match level.as_str() {
+"debug" => log::set_max_level(LevelFilter::Debug),
+"info" => log::set_max_level(LevelFilter::Info),
+_ => log::set_max_level(LevelFilter::Info),
+}
+}
+_ => log::set_max_level(LevelFilter::Info),
+}
+let mut bot = Bot::new();
+bot.run().await;
}
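The `LOG_LEVEL` handling added to `main()` can be factored into a small pure function; a std-only sketch with a local stand-in for `log::LevelFilter`, assuming the same fall-back-to-Info behaviour for unset or unknown values:

```rust
// Stand-in for log::LevelFilter so the sketch stays dependency-free.
#[derive(Debug, PartialEq, Clone, Copy)]
enum LevelFilter {
    Debug,
    Info,
}

// Maps the (optional) LOG_LEVEL environment value to a filter;
// anything other than "debug" or "info" defaults to Info, matching
// the match arms in main().
fn level_from_env(value: Option<&str>) -> LevelFilter {
    match value {
        Some("debug") => LevelFilter::Debug,
        Some("info") => LevelFilter::Info,
        _ => LevelFilter::Info,
    }
}
```

In `main()` this would be called as `level_from_env(std::env::var("LOG_LEVEL").ok().as_deref())`.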