
Commit 8d3e202

Merge pull request #233 from isangwanrahul/main
updates for adding some more links
2 parents: 336d8ea + 711f302

File tree

2 files changed: +48 −41 lines


bootcamp/introduction.md

Lines changed: 34 additions & 1 deletion
````diff
@@ -2,6 +2,7 @@
 
 This will be six weeks of curricula
 
+- Bootcamp Database Setup is [here](https://www.dataexpert.io/lesson/boot-camp-database-setup-yt)
 - Dimensional Data Modeling
   - Homework is [here](materials/1-dimensional-data-modeling/homework/homework.md)
   - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/dimensional-data-modeling-lecture-day-1-yt)
````
````diff
@@ -11,12 +12,44 @@ This will be six weeks of curricula
   - Day 3 Lecture is [here](https://www.dataexpert.io/lesson/dimensional-data-modeling-day-3-lecture-yt)
   - Day 3 Lab is [here](https://www.dataexpert.io/lesson/dimensional-data-modeling-day-3-lab-yt)
 - Fact Data Modeling
-  - Homework is (to be added)
+  - Homework is [here](materials/2-fact-data-modeling/homework/homework.md)
+  - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/fact-data-modeling-day-1-lecture-yt)
+  - Day 1 Lab is [here](https://www.dataexpert.io/lesson/fact-data-modeling-day-1-lab-yt)
+  - Day 2 Lecture is [here](https://www.dataexpert.io/lesson/fact-data-modeling-day-2-lecture-yt)
+  - Day 2 Lab is [here](https://www.dataexpert.io/lesson/fact-data-modeling-day-2-lab-yt)
+  - Day 3 Lecture is [here](https://www.dataexpert.io/lesson/fact-data-modeling-day-3-lecture-yt)
+  - Day 3 Lab is [here](https://www.dataexpert.io/lesson/fact-data-modeling-day-3-lab-yt)
 - Data Quality (analytics)
 - Data Quality (infrastructure)
 - Data impact and visualization (analytics)
 - Data pipeline maintenance (infrastructure)
+  - Homework is [here](materials/6-data-pipeline-maintenance/homework/homework.md)
+  - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/data-pipeline-maintenance-day-1-lecture-yt)
+  - Day 1 Lab is [here](https://www.dataexpert.io/lesson/data-pipeline-maintenance-day-1-lab-yt)
+  - Day 2 Lecture is [here](https://www.dataexpert.io/lesson/data-pipeline-maintenance-day-2-lecture-yt)
 - Applying Analytical Patterns (analytics)
+  - Homework is [here](materials/4-applying-analytical-patterns/homework/homework.md)
+  - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/data-quality-patterns-day-1-lecture-yt)
+  - Day 1 Lab is [here](https://www.dataexpert.io/lesson/data-quality-patterns-day-1-lab-yt)
+  - Day 2 Lecture is [here](https://www.dataexpert.io/lesson/data-quality-patterns-day-2-lecture-yt)
 - Real-time Pipelines with Flink and Kafka (infrastructure)
+  - Flink setup is [here](https://www.dataexpert.io/lesson/flink-lab-setup-yt)
+  - Homework is [here](materials/4-apache-flink-training/homework/homework.md)
+  - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/streaming-pipelines-day-1-lecture-yt)
+  - Day 1 Lab is [here](https://www.dataexpert.io/lesson/streaming-pipelines-day-1-lab-yt)
+  - Day 2 Lecture is [here](https://www.dataexpert.io/lesson/streaming-pipelines-day-2-lecture-yt)
+  - Day 2 Lab is [here](https://www.dataexpert.io/lesson/streaming-pipelines-day-2-lab-yt)
 - KPIs and Experimentation (analytics)
+  - Homework is [here](materials/5-kpis-and-experimentation/homework/homework.md)
+  - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/kpis-and-experimentation-day-1-lecture-yt)
+  - Day 1 Lab is [here](https://www.dataexpert.io/lesson/kpis-and-experimentation-day-1-lab-yt)
+  - Day 2 Lecture is [here](https://www.dataexpert.io/lesson/kpis-and-experimentation-day-2-lecture-yt)
 - Apache Spark fundamentals (infrastructure)
+  - Homework is [here](materials/3-spark-fundamentals/homework/homework.md)
+  - Testing Homework is [here](materials/3-spark-fundamentals/homework/homework_testing.md)
+  - Day 1 Lecture is [here](https://www.dataexpert.io/lesson/apache-spark-day-1-lecture-yt)
+  - Day 1 Lab is [here](https://www.dataexpert.io/lesson/apache-spark-day-1-lab-yt)
+  - Day 2 Lecture is [here](https://www.dataexpert.io/lesson/apache-spark-day-2-lecture-yt)
+  - Day 2 Lab is [here](https://www.dataexpert.io/lesson/apache-spark-day-2-lab-yt)
+  - Day 3 Lecture is [here](https://www.dataexpert.io/lesson/apache-spark-day-3-lecture-yt)
+  - Day 3 Lab is [here](https://www.dataexpert.io/lesson/apache-spark-day-3-lab-yt)
````

bootcamp/materials/1-dimensional-data-modeling/README.md

Lines changed: 14 additions & 40 deletions
````diff
@@ -81,39 +81,25 @@ There are two methods to get Postgres running locally.
 - You can check that your Docker Compose stack is running by either:
   - Going into Docker Desktop: you should see an entry there with a drop-down for each of the containers running in your Docker Compose stack.
   - Running **`docker ps -a`** and looking for the containers with the name **`postgres`**.
-- If you navigate to **`http://localhost:5050`** you will be able to see the PGAdmin instance up and running and should be able to connect to the following server:
-![Image showing the setup for PGAdmin](.attachments/pgadmin-server.png)
-Where:
-- Host name: host.docker.internal (Or container name i.e my-postgres-container)
-- Port: 5432
-- Username: postgres
-- Password: postgres
+- If you navigate to **`http://localhost:5050`** you will be able to see the PGAdmin instance up and running and should be able to connect to the following server as details shown:
+
+<img src=".attachments/pgadmin-server.png" style="width:500px;"/>
 
 
-- When you're finished with your Postgres instance, you can stop the Docker Compose containers with:
+- When you're finished with your Postgres instance(required in week 1 & 2 & 4), you can stop the Docker Compose containers with:
 
 ```bash
-make down
+docker compose stop
 ```
-
-Or if you're on Windows:
-
-```bash
-docker compose down -v
-```
-
-### :rotating_light: **Need help loading tables?** :rotating_light:
 
-> Refer to the instructions below to resolve the issue when the data dump fails to load tables, displaying the message `PostgreSQL Database directory appears to contain a database; Skipping initialization.`
->
-
-## :three: **Connect to Postgres in Database Client**
+## :three: **Connect to Postgres in Local Database Client**
 
 - Some options for interacting with your Postgres instance:
-  - DataGrip - JetBrains; 30-day free trial or paid version.
+  - DataGrip - JetBrains; 30-day free trial or paid version
   - VSCode built-in extension (there are a few of these).
   - PGAdmin.
   - Postbird.
+  - Dbeaver
 - Using your client of choice, follow the instructions to establish a new PostgreSQL connection.
 - The default username is **`postgres`** and corresponds to **`$POSTGRES_USER`** in your **`.env`**.
 - The default password is **`postgres`** and corresponds to **`$POSTGRES_PASSWORD`** in your **`.env`**.
````
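The commands this hunk touches can be sketched as one end-to-end flow. A minimal sketch under the README's stated defaults (user/password `postgres`, port 5432); service and container names vary by machine, and `host.docker.internal` is only needed when connecting from another container such as PGAdmin:

```shell
# Confirm the Compose stack is up and the Postgres container is running
docker ps -a --filter "name=postgres"

# From the host, connect with the default credentials from .env
psql "postgresql://postgres:postgres@localhost:5432/postgres" -c "SELECT 1;"

# When finished, stop the containers; `stop` keeps the named volumes,
# so the loaded tables survive the next `docker compose start`
docker compose stop

# The removed `docker compose down -v` variant also deletes the volumes,
# which is why the commit drops it: the data would have to be reloaded.
```

The last comment is the practical reason to prefer `stop` here; `down -v` is only appropriate when you want a clean slate.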
````diff
@@ -125,28 +111,16 @@ Where:
 
 - If the test connection is successful, click "Finish" or "Save" to save the connection. You should now be able to use the database client to manage your PostgreSQL database locally.
 
+### :rotating_light: **Need help loading tables?** :rotating_light:
+
+> Refer to the instructions below to resolve the issue when the data dump fails to load tables, displaying the message `PostgreSQL Database directory appears to contain a database; Skipping initialization.`
 ## **🚨 Tables not loading!? 🚨**
-- If you're seeing errors about `error: invalid command \N`, you should use `pg_restore` to load `data.dump`.
-```bash
-pg_restore -U $POSTGRES_USER -d $POSTGRES_DB data.dump
-```
-- If you are on Windows and used **`docker compose up`**, table creation and data load will not take place with container creation. Once you have docker container up and verified that you are able to connect to empty postgres database with your own choice of client, follow the following steps:
-  1. On Docker desktop, connect to my-postgres-container terminal.
-  2. Run:
-```bash
-psql \
--v ON_ERROR_STOP=1 \
---username $POSTGRES_USER \
---dbname $POSTGRES_DB \
-< /docker-entrypoint-initdb.d/data.dump
-```
-- → This will run the file `data.dump` from inside your docker container.
 
-- If the tables don't come with the loaded data, follow these steps with manual installation of postgres:
+- If the tables don't come with the loaded data, follow these steps with manual local installation of Postgres:
 
 1. Find where your `psql` client is installed (Something like `C:\\Program Files\\PostgreSQL\\13\\runpsql.bat`)
 2. Make sure you're in the root of the repo, and launch `psql` by running that `.bat` script
-3. Enter your credentials for postgres (described in the connect to postgres section)
+3. Enter your credentials for Postgres (described in the connect to Postgres section)
 - → If the above worked, you should now be inside a psql REPL (It looks like `postgres=#`)
 4. Run:
 
````
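The two load paths that appear in this hunk (one kept, one removed) can be sketched side by side. A minimal sketch assuming the repo's `data.dump` and the `$POSTGRES_USER`/`$POSTGRES_DB` values from `.env`; which one you need depends on the dump's format:

```shell
# Custom-format dump: feeding it to plain psql produces the
# `error: invalid command \N` symptom; restore it with pg_restore instead
pg_restore -U "$POSTGRES_USER" -d "$POSTGRES_DB" data.dump

# Plain-SQL dump: pipe it through psql, aborting on the first error
psql \
  -v ON_ERROR_STOP=1 \
  --username "$POSTGRES_USER" \
  --dbname "$POSTGRES_DB" \
  < data.dump
```

Both commands assume a running Postgres instance reachable with those credentials; run them from the repo root so `data.dump` resolves.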
````diff
@@ -156,7 +130,7 @@ Where:
 
 - → This will run the file `data.dump` from inside your psql REPL.
 
-- If you did the setup using Option 2, and the tables are not in the database, another solution is to:
+- If you did the setup using Option 2 which is Docker option, and the tables are not in the database, another solution is to:
 
 1. Find the container id by running `docker ps` - under CONTAINER ID
 2. Go inside the container by executing `docker exec -it <container_name_or_id>` bash`
````
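The hunk ends after step 2, but the steps it keeps continue inside the container. A minimal sketch of the full sequence, assuming the dump is mounted at `/docker-entrypoint-initdb.d/data.dump` as in the lines removed from the section above (the later steps are not shown in this diff and are inferred from that removed block):

```shell
# 1. Find the container id (under the CONTAINER ID column)
docker ps

# 2. Open a shell inside the container
docker exec -it <container_name_or_id> bash

# Inside the container: load the mounted dump, aborting on the first error
psql -v ON_ERROR_STOP=1 \
  --username "$POSTGRES_USER" \
  --dbname "$POSTGRES_DB" \
  < /docker-entrypoint-initdb.d/data.dump
```

`$POSTGRES_USER` and `$POSTGRES_DB` are already set in the container's environment by the official Postgres image, so no extra exports are needed inside the shell.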
