New Config: de-duplicate setters/getters properties #90

Closed

blindspotbounty opened this issue Jul 17, 2023 · 8 comments

@blindspotbounty
Collaborator

As far as I can see, a lot of properties are duplicated between the Producer and Consumer configurations.

For example:

./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["receive.message.max.bytes"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["max.in.flight.requests.per.connection"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["metadata.max.age.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["topic.metadata.refresh.interval.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["topic.metadata.refresh.fast.interval.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["topic.metadata.refresh.sparse"] = newValue.description }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["topic.metadata.propagation.max.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["topic.blacklist"] = newValue.joined(separator: ",") }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.timeout.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.send.buffer.bytes"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.receive.buffer.bytes"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.keepalive.enable"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.nagle.disable"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.max.fails"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift:        set { self.dictionary["socket.connection.setup.timeout.ms"] = String(newValue) }

and

./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["receive.message.max.bytes"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["max.in.flight.requests.per.connection"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["metadata.max.age.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["topic.metadata.refresh.interval.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["topic.metadata.refresh.fast.interval.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["topic.metadata.refresh.sparse"] = newValue.description }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["topic.metadata.propagation.max.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["topic.blacklist"] = newValue.joined(separator: ",") }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.timeout.ms"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.send.buffer.bytes"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.receive.buffer.bytes"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.keepalive.enable"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.nagle.disable"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.max.fails"] = String(newValue) }
./Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift:        set { self.dictionary["socket.connection.setup.timeout.ms"] = String(newValue) }
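
Each of these lines is the setter half of a computed property that appears identically in both files. For illustration, one such property looks roughly like this (a sketch: the property name, the getter, and the default value are reconstructed from the setter, not copied from the source):

public var socketTimeoutMilliseconds: UInt {
    get { self.dictionary["socket.timeout.ms"].flatMap { UInt($0) } ?? 60000 }
    set { self.dictionary["socket.timeout.ms"] = String(newValue) }
}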

It probably makes sense to put them in one place and de-duplicate this code, unless I am missing something and this was done intentionally.

@felixschlegel could you suggest on that, please?

@felixschlegel
Contributor

Good catch 😄, this was indeed intentional: we wanted to have lightweight structs for our configurations without adding any new types to our public API that only serve internal code de-duplication. Did you have anything particular in mind to achieve the de-duplication here?

@blindspotbounty
Collaborator Author

1. I was thinking of some protocol with extensions, e.g. (the getUInt helper this relies on is sketched after the second option):
public protocol KafkaConfig {
    var dictionary: [String: String] { get set }
    
    var socketTimeoutMs: UInt { get set }
}


public extension KafkaConfig {
    var socketTimeoutMs: UInt {
        get { self.dictionary.getUInt("socket.timeout.ms") ?? 60000 }
        set { self.dictionary["socket.timeout.ms"] = String(newValue) }
    }
}

public struct KafkaProducerConfiguration: KafkaConfig { ... }
2. Some struct similar to the Consumer/Producer configurations, plus a computed property that merges the dictionaries:
public struct KafkaSharedConfig {
    public var dictionary: [String: String] = [:]
}

public struct KafkaProducerConfiguration {
    var sharedSettings = KafkaSharedConfig()
    var producerDictionary: [String: String] = [:]

    var dictionary: [String: String] {
        // Prefer the producer-specific value if a key somehow ends up in both dictionaries
        producerDictionary.merging(sharedSettings.dictionary) { producer, _ in producer }
    }
}
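
Option 1 assumes that getUInt exists on Dictionary; it is not a standard API, so (if it is not already an internal helper in the package) it would need something along these lines. This is only a sketch to make the getter compile:

extension Dictionary where Key == String, Value == String {
    /// Reads the value for the given key and converts it to UInt, if present and parseable.
    func getUInt(_ key: String) -> UInt? {
        self[key].flatMap { UInt($0) }
    }
}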

@felixschlegel
Contributor

Both approaches are very reasonable, but they clash with the idea of not wanting to expose an extra type for the mere purpose of code de-duplication. Still, thank you for the effort and for bringing this up! 😄

@blindspotbounty
Collaborator Author

Could you explain the reasoning behind avoiding an extra type, please?

Btw, another possibility, which won't be available until approximately September, is to use the new Swift macros (introduced in Swift 5.9).

@felixschlegel
Contributor

Generally, there is nothing speaking against this extra type. However, it would clutter our public API, and we could never remove it, since adopters will most likely use it in their codebases.

Ultimately, it comes down to whether you're happy to live with some duplication or whether you want to add a new type to the public API that can never be removed. When implementing the configuration back then, we decided to go with the former.

Btw, initially I also wanted to use a protocol extension, but then we decided against it (see this PR).

And yes, Swift macros are great! However, we will probably want to support older Swift versions for some time after September. For the future, though, using macros in our package is definitely worth investigating!

@FranzBusch
Contributor

As @felixschlegel wrote, this was an intentional decision to duplicate some code. Introducing a public protocol that only reduces code duplication on our side brings adopters of this library no benefit and just makes our lives harder in the future when we want to evolve the API.

Reducing code duplication is not always the goal when designing public APIs; having correct and evolvable types is. In the end, we agreed that we can live with the duplication and avoid unnecessary protocols.

@blindspotbounty
Collaborator Author

blindspotbounty commented Jul 18, 2023

De-duplicating the code is not the goal in itself. I am more concerned about possibly losing some code in one of the two places when changing it or adding thresholds and checks.
I like the way this is handled in the latest changes for the security part:

        // Merge with SecurityProtocol configuration dictionary
        resultDict.merge(securityProtocol.dictionary) { _, _ in
            fatalError("securityProtocol and \(#file) should not have duplicate keys")
        }

https://github.com/swift-server/swift-kafka-gsoc/blob/main/Sources/SwiftKafka/Configuration/KafkaProducerConfiguration.swift#L134
https://github.com/swift-server/swift-kafka-gsoc/blob/main/Sources/SwiftKafka/Configuration/KafkaConsumerConfiguration.swift#L167

Security is defined in one place, so any changes will be reflected in both the Consumer and Producer configs.
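
Applying the same pattern to the duplicated socket/metadata properties could look roughly like this. This is only a sketch: the KafkaSocketConfiguration type, its property name, and the default value are hypothetical and not part of the current API:

public struct KafkaSocketConfiguration {
    var dictionary: [String: String] = [:]

    public var timeoutMilliseconds: UInt {
        get { self.dictionary["socket.timeout.ms"].flatMap { UInt($0) } ?? 60000 }
        set { self.dictionary["socket.timeout.ms"] = String(newValue) }
    }
}

public struct KafkaProducerConfiguration {
    public var socket = KafkaSocketConfiguration()
    var producerDictionary: [String: String] = [:]

    var dictionary: [String: String] {
        var resultDict = self.producerDictionary
        // Same approach as with securityProtocol: shared and producer-specific keys must not overlap
        resultDict.merge(self.socket.dictionary) { _, _ in
            fatalError("socket and \(#file) should not have duplicate keys")
        }
        return resultDict
    }
}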

@blindspotbounty
Collaborator Author

@felixschlegel, @FranzBusch thank you for your answers!
I believe that the question is closed now :)
