Bump Rust Version #1383


Merged · 2 commits · Jul 21, 2025
4 changes: 2 additions & 2 deletions Cargo.toml
@@ -2,8 +2,8 @@
name = "parseable"
version = "2.3.5"
authors = ["Parseable Team <[email protected]>"]
-edition = "2021"
-rust-version = "1.83.0"
+edition = "2024"
+rust-version = "1.88.0"
Comment on lines +5 to +6

⚠️ Potential issue

Critical: Edition 2024 is not stabilized and causes build failures.

The pipeline failure indicates that edition2024 is not stabilized in Cargo version 1.84.0. This will break builds in environments without the latest nightly Rust toolchain.

Consider using edition = "2021" until edition 2024 is stabilized:

-edition = "2024"
-rust-version = "1.88.0"
+edition = "2021"
+rust-version = "1.83.0"

Alternatively, if you need features from edition 2024, ensure all CI/CD environments support it and document this requirement.


categories = ["logs", "observability", "metrics", "traces"]
build = "build.rs"

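If the project does stay on edition 2024, pinning the toolchain keeps contributors and CI on the same compiler. A minimal sketch of a `rust-toolchain.toml` — this file is not part of the PR, and the channel value is an assumption taken from the new `rust-version` field:

```toml
# Hypothetical rust-toolchain.toml — not part of this PR.
# Pins rustup to the same version the crate's rust-version field requires,
# so local builds and CI both resolve a toolchain that supports edition 2024.
[toolchain]
channel = "1.88.0"
components = ["rustfmt", "clippy"]
```

With this in place, `cargo build` in the repository root would transparently install and use 1.88.0, which also keeps the Dockerfile base images below in sync with local builds.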
2 changes: 1 addition & 1 deletion Dockerfile
@@ -14,7 +14,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.

# build stage
-FROM rust:1.84.0-bookworm AS builder
+FROM rust:1.88.0-bookworm AS builder

LABEL org.opencontainers.image.title="Parseable"
LABEL maintainer="Parseable Team <[email protected]>"
2 changes: 1 addition & 1 deletion Dockerfile.debug
@@ -14,7 +14,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.

# build stage
-FROM docker.io/rust:1.84.0-bookworm AS builder
+FROM docker.io/rust:1.88.0-bookworm AS builder

LABEL org.opencontainers.image.title="Parseable"
LABEL maintainer="Parseable Team <[email protected]>"
Expand Down
2 changes: 1 addition & 1 deletion build.rs
@@ -38,7 +38,7 @@ pub fn main() -> Result<()> {

mod ui {

-use std::fs::{self, create_dir_all, OpenOptions};
+use std::fs::{self, OpenOptions, create_dir_all};
use std::io::{self, Cursor, Read, Write};
use std::path::{Path, PathBuf};
use std::{env, panic};
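Most of the remaining hunks in this PR are the same mechanical change: `cargo fmt` under the 2024 style edition reorders `use` groups so that uppercase-initial names (types, traits, constants) sort ahead of lowercase-initial ones (modules, functions). The sketch below approximates that rule for illustration — it is not rustfmt's actual implementation:

```rust
// Approximation of the 2024 style edition's import ordering (assumed here):
// within a brace group, uppercase-initial names sort before lowercase-initial
// ones, whereas the older style compared names roughly case-insensitively.
fn style_2024_key(name: &str) -> (bool, String) {
    // `false` (uppercase-initial) sorts before `true` (lowercase-initial).
    let lowercase_first = name
        .chars()
        .next()
        .map(|c| c.is_lowercase())
        .unwrap_or(false);
    (lowercase_first, name.to_string())
}

fn main() {
    let mut names = vec!["self", "create_dir_all", "OpenOptions"];
    // `self` conventionally stays first; sort the remaining names.
    names[1..].sort_by_key(|n| style_2024_key(n));
    // Reproduces the build.rs hunk: OpenOptions now precedes create_dir_all.
    assert_eq!(names, vec!["self", "OpenOptions", "create_dir_all"]);
    println!("{names:?}");
}
```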
20 changes: 11 additions & 9 deletions src/alerts/alerts_utils.rs
@@ -27,21 +27,21 @@ use datafusion::{
sum::sum,
},
logical_expr::{BinaryExpr, Literal, Operator},
-prelude::{col, lit, DataFrame, Expr},
+prelude::{DataFrame, Expr, col, lit},
};
use tracing::trace;

use crate::{
alerts::LogicalOperator,
handlers::http::query::{create_streams_for_distributed, update_schema_when_distributed},
parseable::PARSEABLE,
-query::{resolve_stream_names, QUERY_SESSION},
+query::{QUERY_SESSION, resolve_stream_names},
utils::time::TimeRange,
};

use super::{
-AggregateConfig, AggregateFunction, AggregateResult, Aggregates, AlertConfig, AlertError,
-AlertOperator, AlertState, ConditionConfig, Conditions, WhereConfigOperator, ALERTS,
+ALERTS, AggregateConfig, AggregateFunction, AggregateResult, Aggregates, AlertConfig,
+AlertError, AlertOperator, AlertState, ConditionConfig, Conditions, WhereConfigOperator,
};

/// accept the alert
@@ -473,9 +473,7 @@ fn match_alert_operator(expr: &ConditionConfig) -> Expr {
WhereConfigOperator::LessThanOrEqual => col(column).lt_eq(lit(value)),
WhereConfigOperator::GreaterThanOrEqual => col(column).gt_eq(lit(value)),
WhereConfigOperator::ILike => col(column).ilike(lit(string_value)),
-WhereConfigOperator::Contains => {
-    col(column).like(lit(format!("%{string_value}%")))
-},
+WhereConfigOperator::Contains => col(column).like(lit(format!("%{string_value}%"))),
WhereConfigOperator::BeginsWith => Expr::BinaryExpr(BinaryExpr::new(
Box::new(col(column)),
Operator::RegexIMatch,
@@ -497,15 +495,19 @@ fn match_alert_operator(expr: &ConditionConfig) -> Expr {
Operator::RegexNotIMatch,
Box::new(lit(format!("{string_value}$"))),
)),
-_ => unreachable!("value must not be null for operators other than `is null` and `is not null`. Should've been caught in validation")
+_ => unreachable!(
+    "value must not be null for operators other than `is null` and `is not null`. Should've been caught in validation"
+),
}
} else {
// for maintaining column case
let column = format!(r#""{}""#, expr.column);
match expr.operator {
WhereConfigOperator::IsNull => col(column).is_null(),
WhereConfigOperator::IsNotNull => col(column).is_not_null(),
-_ => unreachable!("value must be null for `is null` and `is not null`. Should've been caught in validation")
+_ => unreachable!(
+    "value must be null for `is null` and `is not null`. Should've been caught in validation"
+),
}
}
}
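Beyond formatting, the hunk above collapses the `Contains` arm into a single line. For reference, the arms in `match_alert_operator` build SQL LIKE / regex patterns roughly as follows — a standalone sketch, not Parseable's actual helpers, and the `^` anchor for `BeginsWith` is inferred from the surrounding code:

```rust
// Hedged sketch (not Parseable's code): the string patterns the Contains,
// BeginsWith, and EndsWith arms hand to DataFusion's LIKE / regex operators.
fn contains_pattern(value: &str) -> String {
    format!("%{value}%") // SQL LIKE: match anywhere in the column value
}

fn begins_with_pattern(value: &str) -> String {
    format!("^{value}") // regex: anchor at the start of the column value
}

fn ends_with_pattern(value: &str) -> String {
    format!("{value}$") // regex: anchor at the end of the column value
}

fn main() {
    assert_eq!(contains_pattern("error"), "%error%");
    assert_eq!(begins_with_pattern("WARN"), "^WARN");
    assert_eq!(ends_with_pattern("timeout"), "timeout$");
}
```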
6 changes: 3 additions & 3 deletions src/alerts/mod.rs
@@ -20,8 +20,8 @@ use actix_web::http::header::ContentType;
use async_trait::async_trait;
use chrono::Utc;
use datafusion::sql::sqlparser::parser::ParserError;
-use derive_more::derive::FromStr;
use derive_more::FromStrError;
+use derive_more::derive::FromStr;
use http::StatusCode;
use once_cell::sync::Lazy;
use serde::Serialize;
@@ -31,7 +31,7 @@ use std::fmt::{self, Display};
use std::thread;
use std::time::Duration;
use tokio::sync::oneshot::{Receiver, Sender};
-use tokio::sync::{mpsc, RwLock};
+use tokio::sync::{RwLock, mpsc};
use tokio::task::JoinHandle;
use tracing::{error, trace, warn};
use ulid::Ulid;
@@ -40,7 +40,7 @@ pub mod alerts_utils;
pub mod target;

use crate::alerts::target::TARGETS;
-use crate::parseable::{StreamNotFound, PARSEABLE};
+use crate::parseable::{PARSEABLE, StreamNotFound};
use crate::rbac::map::SessionKey;
use crate::storage;
use crate::storage::ObjectStorageError;
12 changes: 8 additions & 4 deletions src/alerts/target.rs
@@ -26,11 +26,11 @@ use async_trait::async_trait;
use base64::Engine;
use bytes::Bytes;
use chrono::Utc;
-use http::{header::AUTHORIZATION, HeaderMap, HeaderValue};
+use http::{HeaderMap, HeaderValue, header::AUTHORIZATION};
use itertools::Itertools;
use once_cell::sync::Lazy;
use reqwest::ClientBuilder;
-use serde_json::{json, Value};
+use serde_json::{Value, json};
use tokio::sync::RwLock;
use tracing::{error, trace, warn};
use ulid::Ulid;
@@ -288,7 +288,9 @@ impl Target {
state
} else {
*state.lock().unwrap() = TimeoutState::default();
-warn!("Unable to fetch state for given alert_id- {alert_id}, stopping target notifs");
+warn!(
+    "Unable to fetch state for given alert_id- {alert_id}, stopping target notifs"
+);
return;
};

@@ -304,7 +306,9 @@
state
} else {
*state.lock().unwrap() = TimeoutState::default();
-warn!("Unable to fetch state for given alert_id- {alert_id}, stopping target notifs");
+warn!(
+    "Unable to fetch state for given alert_id- {alert_id}, stopping target notifs"
+);
return;
};

11 changes: 6 additions & 5 deletions src/analytics.rs
@@ -16,7 +16,7 @@
*
*
*/
-use actix_web::{web, HttpRequest, Responder};
+use actix_web::{HttpRequest, Responder, web};
use chrono::{DateTime, Utc};
use clokwerk::{AsyncScheduler, Interval};
use http::header;
@@ -31,19 +31,20 @@ use tracing::{error, info};
use ulid::Ulid;

use crate::{
+HTTP_CLIENT, INTRA_CLUSTER_CLIENT,
about::{current, platform},
handlers::{
+STREAM_NAME_HEADER_KEY,
http::{
base_path_without_preceding_slash,
cluster::{self, utils::check_liveness},
modal::{NodeMetadata, NodeType},
},
-STREAM_NAME_HEADER_KEY,
},
option::Mode,
parseable::PARSEABLE,
stats::{self, Stats},
-storage, HTTP_CLIENT, INTRA_CLUSTER_CLIENT,
+storage,
};

const ANALYTICS_SERVER_URL: &str = "https://analytics.parseable.io:80";
@@ -239,8 +240,8 @@ fn total_event_stats() -> (Stats, Stats, Stats) {
)
}

-async fn fetch_ingestors_metrics(
-) -> anyhow::Result<(u64, u64, usize, u64, u64, u64, u64, u64, u64, u64, u64, u64)> {
+async fn fetch_ingestors_metrics()
+-> anyhow::Result<(u64, u64, usize, u64, u64, u64, u64, u64, u64, u64, u64, u64)> {
let event_stats = total_event_stats();
let mut node_metrics = NodeMetrics::new(
total_streams(),
4 changes: 2 additions & 2 deletions src/audit.rs
@@ -21,12 +21,12 @@ use std::{
fmt::{Debug, Display},
};

-use crate::{about::current, parseable::PARSEABLE, storage::StorageMetadata, HTTP_CLIENT};
+use crate::{HTTP_CLIENT, about::current, parseable::PARSEABLE, storage::StorageMetadata};

use chrono::{DateTime, Utc};
use once_cell::sync::Lazy;
use serde::Serialize;
-use serde_json::{json, Value};
+use serde_json::{Value, json};
use tracing::error;

use ulid::Ulid;
2 changes: 1 addition & 1 deletion src/catalog/mod.rs
@@ -41,7 +41,7 @@ use crate::{
query::PartialTimeFilter,
stats::{event_labels_date, get_current_stats, storage_size_labels_date, update_deleted_stats},
storage::{
-object_storage::manifest_path, ObjectStorage, ObjectStorageError, ObjectStoreFormat,
+ObjectStorage, ObjectStorageError, ObjectStoreFormat, object_storage::manifest_path,
},
};
pub use manifest::create_from_parquet_file;
2 changes: 1 addition & 1 deletion src/cli.rs
@@ -26,7 +26,7 @@ use crate::connectors::kafka::config::KafkaConfig;

use crate::{
oidc::{self, OpenidConfig},
-option::{validation, Compression, Mode},
+option::{Compression, Mode, validation},
storage::{AzureBlobConfig, FSConfig, GcsConfig, S3Config},
};

8 changes: 4 additions & 4 deletions src/connectors/kafka/consumer.rs
@@ -16,20 +16,20 @@
*
*/

-use crate::connectors::common::shutdown::Shutdown;
use crate::connectors::common::ConnectorError;
+use crate::connectors::common::shutdown::Shutdown;
use crate::connectors::kafka::partition_stream::{PartitionStreamReceiver, PartitionStreamSender};
use crate::connectors::kafka::state::StreamState;
use crate::connectors::kafka::{
-partition_stream, ConsumerRecord, KafkaContext, StreamConsumer, TopicPartition,
+ConsumerRecord, KafkaContext, StreamConsumer, TopicPartition, partition_stream,
};
use futures_util::FutureExt;
+use rdkafka::Statistics;
use rdkafka::consumer::Consumer;
use rdkafka::message::BorrowedMessage;
-use rdkafka::Statistics;
use std::sync::Arc;
use std::time::Duration;
-use tokio::sync::{mpsc, RwLock};
+use tokio::sync::{RwLock, mpsc};
use tokio_stream::wrappers::ReceiverStream;
use tracing::{error, info, warn};

4 changes: 2 additions & 2 deletions src/connectors/kafka/metrics.rs
@@ -18,8 +18,8 @@

use prometheus::core::{Collector, Desc};
use prometheus::{
-proto, Histogram, HistogramOpts, HistogramVec, IntCounter, IntCounterVec, IntGauge,
-IntGaugeVec, Opts,
+Histogram, HistogramOpts, HistogramVec, IntCounter, IntCounterVec, IntGauge, IntGaugeVec, Opts,
+proto,
};
use rdkafka::Statistics;
use std::sync::{Arc, RwLock};
2 changes: 1 addition & 1 deletion src/connectors/kafka/partition_stream.rs
@@ -18,7 +18,7 @@

use crate::connectors::kafka::{ConsumerRecord, TopicPartition};
use std::sync::Arc;
-use tokio::sync::{mpsc, Notify};
+use tokio::sync::{Notify, mpsc};
use tokio_stream::wrappers::ReceiverStream;
use tracing::{error, info};

4 changes: 2 additions & 2 deletions src/connectors/kafka/processor.rs
@@ -28,14 +28,14 @@ use tracing::{debug, error};
use crate::{
connectors::common::processor::Processor,
event::{
-format::{json, EventFormat, LogSourceEntry},
Event as ParseableEvent, USER_AGENT_KEY,
+format::{EventFormat, LogSourceEntry, json},
},
parseable::PARSEABLE,
storage::StreamType,
};

-use super::{config::BufferConfig, ConsumerRecord, StreamConsumer, TopicPartition};
+use super::{ConsumerRecord, StreamConsumer, TopicPartition, config::BufferConfig};

#[derive(Default, Debug, Clone)]
pub struct ParseableSinkProcessor;
2 changes: 1 addition & 1 deletion src/connectors/kafka/rebalance_listener.rs
@@ -17,8 +17,8 @@
*/

use crate::connectors::common::shutdown::Shutdown;
-use crate::connectors::kafka::state::StreamState;
use crate::connectors::kafka::RebalanceEvent;
+use crate::connectors::kafka::state::StreamState;
use std::sync::Arc;
use tokio::sync::RwLock;
use tokio::{runtime::Handle, sync::mpsc::Receiver};
2 changes: 1 addition & 1 deletion src/connectors/kafka/sink.rs
@@ -17,9 +17,9 @@
*/
use crate::connectors::common::build_runtime;
use crate::connectors::common::processor::Processor;
+use crate::connectors::kafka::ConsumerRecord;
use crate::connectors::kafka::consumer::KafkaStreams;
use crate::connectors::kafka::processor::StreamWorker;
-use crate::connectors::kafka::ConsumerRecord;
use anyhow::Result;
use futures_util::StreamExt;
use rdkafka::consumer::Consumer;
6 changes: 3 additions & 3 deletions src/connectors/mod.rs
@@ -21,9 +21,9 @@ use std::sync::Arc;
use actix_web_prometheus::PrometheusMetrics;
use common::{processor::Processor, shutdown::Shutdown};
use kafka::{
-config::KafkaConfig, consumer::KafkaStreams, metrics::KafkaMetricsCollector,
-processor::ParseableSinkProcessor, rebalance_listener::RebalanceListener,
-sink::KafkaSinkConnector, state::StreamState, ConsumerRecord, KafkaContext,
+ConsumerRecord, KafkaContext, config::KafkaConfig, consumer::KafkaStreams,
+metrics::KafkaMetricsCollector, processor::ParseableSinkProcessor,
+rebalance_listener::RebalanceListener, sink::KafkaSinkConnector, state::StreamState,
};
use prometheus::Registry;
use tokio::sync::RwLock;
4 changes: 2 additions & 2 deletions src/correlation.rs
@@ -18,7 +18,7 @@

use std::collections::{HashMap, HashSet};

-use actix_web::{http::header::ContentType, Error};
+use actix_web::{Error, http::header::ContentType};
use chrono::Utc;
use datafusion::error::DataFusionError;
use http::StatusCode;
@@ -37,7 +37,7 @@ use crate::{
},
parseable::PARSEABLE,
query::QUERY_SESSION,
-rbac::{map::SessionKey, Users},
+rbac::{Users, map::SessionKey},
storage::ObjectStorageError,
users::filters::FilterQuery,
utils::{get_hash, user_auth_for_datasets},
5 changes: 3 additions & 2 deletions src/enterprise/utils.rs
@@ -8,12 +8,13 @@ use relative_path::RelativePathBuf;
use crate::query::stream_schema_provider::extract_primary_filter;
use crate::{
catalog::{
+Snapshot,
manifest::{File, Manifest},
-snapshot, Snapshot,
+snapshot,
},
event,
parseable::PARSEABLE,
-query::{stream_schema_provider::ManifestExt, PartialTimeFilter},
+query::{PartialTimeFilter, stream_schema_provider::ManifestExt},
storage::{ObjectStorage, ObjectStorageError, ObjectStoreFormat, STREAM_ROOT_DIRECTORY},
utils::time::TimeRange,
};
2 changes: 1 addition & 1 deletion src/event/format/json.rs
@@ -21,7 +21,7 @@

use anyhow::anyhow;
use arrow_array::RecordBatch;
-use arrow_json::reader::{infer_json_schema_from_iterator, ReaderBuilder};
+use arrow_json::reader::{ReaderBuilder, infer_json_schema_from_iterator};
use arrow_schema::{DataType, Field, Fields, Schema};
use chrono::{DateTime, NaiveDate, NaiveDateTime, Utc};
use datafusion::arrow::util::bit_util::round_upto_multiple_of_64;