
Commit 10c8c2e

answered 2 python exercises
1 parent 04854fa commit 10c8c2e

File tree

2 files changed: +23 additions, -11 deletions

Sprint-1/Python/has_pair_with_sum/has_pair_with_sum.py

Lines changed: 2 additions & 0 deletions

@@ -42,4 +42,6 @@ def has_pair_with_sum(numbers: List[Number], target_sum: Number) -> bool:
     O(N) (Linear Space). We introduce the seen_numbers set, which, in the worst case,
     will store up to N elements from the input list.
+
+    Resources: https://www.w3schools.com/python/ref_set_intersection.asp
     '''
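The hunk above shows only the tail of the has_pair_with_sum docstring, so the function body is not visible in this commit. A minimal sketch of the O(N)-time, O(N)-space approach the docstring describes, using a seen_numbers set (the names has_pair_with_sum and Number come from the hunk header; the Number alias and the body itself are assumptions, not the committed code):

```python
from typing import List, Union

# Assumed alias; the actual definition of Number is not shown in this hunk.
Number = Union[int, float]


def has_pair_with_sum(numbers: List[Number], target_sum: Number) -> bool:
    """Return True if any two elements of numbers add up to target_sum."""
    seen_numbers = set()  # worst case stores up to N elements, per the docstring
    for number in numbers:
        # If we have already seen the complement, the pair exists.
        if target_sum - number in seen_numbers:
            return True
        seen_numbers.add(number)
    return False
```

For example, `has_pair_with_sum([1, 4, 7], 11)` returns True because 4 + 7 == 11, while `has_pair_with_sum([1, 2, 3], 7)` returns False.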
Lines changed: 21 additions & 11 deletions

@@ -1,25 +1,35 @@
-from typing import List, Sequence, TypeVar
+from typing import List, Sequence, TypeVar, Set
 
 
 ItemType = TypeVar("ItemType")
 
 
 def remove_duplicates(values: Sequence[ItemType]) -> List[ItemType]:
     """
     Remove duplicate values from a sequence, preserving the order of the first occurrence of each value.
+    Refactored to use a set for O(1) average time lookups, achieving O(N) overall time complexity.
 
-    Time complexity:
-    Space complexity:
-    Optimal time complexity:
+    Time complexity: O(N)
+    Space complexity: O(N)
+    Optimal time complexity: O(N)
     """
-    unique_items = []
+    seen: Set[ItemType] = set()
+    unique_items: List[ItemType] = []
 
     for value in values:
-        is_duplicate = False
-        for existing in unique_items:
-            if value == existing:
-                is_duplicate = True
-                break
-        if not is_duplicate:
+        if value not in seen:
+            seen.add(value)
             unique_items.append(value)
 
     return unique_items
+
+"""
+Explanation:
+The inner loop in the original version performed a linear scan through the unique_items list to check for duplicates, which was the source of the O(N^2) time complexity.
+
+The only effective way to reduce the O(N^2) complexity for this problem is to replace the
+linear search (for existing in unique_items) with an O(1) average time lookup,
+which requires a hash set (set in Python). The optimal approach is to use a set to track seen items, achieving O(N) time complexity.
+
+Resource: https://www.w3schools.com/python/ref_set_intersection.asp
+"""

0 commit comments
