Update v1.1.0
Update Minor Version 1.1.0
taruma authored May 20, 2022
2 parents 74995d4 + 064be05 commit 02bf592
Showing 8 changed files with 436 additions and 52 deletions.
8 changes: 5 additions & 3 deletions README.md
@@ -1,8 +1,8 @@
# Dashboard Rainfall Data Explorer
# hidrokit Rainfall Data Explorer (hidrokit-RDE)

![image](https://user-images.githubusercontent.com/1007910/167613715-7b3db12e-47e5-4d43-8765-19ac3551ed46.png)
![image](https://user-images.githubusercontent.com/1007910/169490220-d0b5a944-fa36-452b-b7e1-fcf5e04e415b.png)

__Rainfall Data Explorer__ (`hkrainfall`) is a _dashboard_ for exploring rainfall data at each station and comparing stations both numerically and visually. `hkrainfall` is built with [Dash + Plotly](https://plotly.com/) and the Python programming language. The `hkrainfall` project is _open source_ under the MIT license.
__Rainfall Data Explorer__ (`hkrainfall` / `hidrokit-RDE`) is a _dashboard_ for exploring rainfall data at each station and comparing stations both numerically and visually. `hkrainfall` is built with [Dash + Plotly](https://plotly.com/) and the Python programming language. The `hkrainfall` project is _open source_ under the MIT license.

## How to Run the Dashboard (Locally)

@@ -29,6 +29,7 @@ The dashboard navigation includes:
- Once the table has been corrected, you can move on to the data analysis stage. Keep in mind that the analysis uses the table exactly as it is currently displayed, so if a filter is still active, only the filtered data is analyzed.
- Click "Analyze Data" to run the analysis. Note that this step can take a while on large datasets, so running it on a local machine is strongly recommended; the web version is only a demonstration hosted on a free service with very limited capacity.
- The analysis is split into three periods: biweekly, monthly, and yearly. Note that the biweekly period divides each month into its first 16 days and the remaining days (see the sketch following this file's changes).
- __New in `v1.1.0`__: consistency analysis (double mass curve) and annual cumulative rainfall.
- The computed statistics are:
  - `days`: the number of days in each period (16 days for biweekly, 1 month for monthly, and 1 year for yearly).
  - `max`: the maximum value in each period.
@@ -43,6 +44,7 @@ The dashboard navigation includes:
- Grouped bar chart for each period using the `max` and `sum` columns. This chart gives a direct comparison of values between stations.
- Stacked bar chart for each period using the `n_rain` and `n_dry` columns. This chart gives a quick overview of which periods have high or low rain/dry-day frequencies.
- Bubble chart (Maximum Rainfall Events) shows, for each station, the dates on which the maximum rainfall occurred. The circle size indicates how large that maximum rainfall was.
- __New in `v1.1.0`__: added annual cumulative rainfall and consistency (double mass curve) charts for each station.

Navigation with interactive plotly charts:

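The biweekly split described above (the first 16 days of each month, then the remaining days) can be illustrated with a small pandas sketch. This is only an illustration of the grouping, not the dashboard's own implementation: the app performs it through `pyfunc.generate_summary_all(dataframe, n_days=["16D", "MS", "YS"])`, whose internals are not part of this commit, and the sample series and helper `biweekly_period` below are hypothetical.

```python
# Illustrative sketch only; not the pyfunc implementation used by the app.
# Summarize one station's daily rainfall into the biweekly periods described
# in the README: days 1-16 of each month, then the remaining days.
import numpy as np
import pandas as pd

# hypothetical daily rainfall series for a single station
dates = pd.date_range("2021-01-01", "2021-12-31", freq="D")
rainfall = pd.Series(
    np.random.default_rng(0).gamma(shape=0.5, scale=15, size=len(dates)).round(1),
    index=dates,
    name="STA_A",
)
rainfall = rainfall.where(rainfall > 2, 0.0)  # treat small values as dry days


def biweekly_period(day: pd.Timestamp) -> pd.Timestamp:
    """Map a date to the start of its biweekly period (day 1 or day 17)."""
    return day.replace(day=1 if day.day <= 16 else 17)


def count_rain(s):
    return int((s > 0).sum())


def count_dry(s):
    return int((s == 0).sum())


summary = rainfall.groupby(biweekly_period).agg(
    days="size",        # number of days in the period
    max="max",          # maximum daily rainfall
    sum="sum",          # total rainfall
    n_rain=count_rain,  # rainy days
    n_dry=count_dry,    # dry days
)
print(summary.head())
```

The `"MS"` and `"YS"` strings passed to `generate_summary_all` in `app.py` look like the pandas offset aliases for month start and year start; how `"16D"` is mapped onto the month-based split above is internal to `pyfunc`.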
166 changes: 148 additions & 18 deletions app.py
@@ -21,9 +21,6 @@
"https://cdn.jsdelivr.net/gh/AnnMarieW/[email protected]/dbc.min.css"
)

# GLOBAL VARS
SUMMARY_ALL = None

# APP
app = dash.Dash(
    APP_TITLE,
@@ -42,6 +39,7 @@
        pylayout.HTML_TITLE,
        pylayout.HTML_SUBTITLE,
        pylayout.HTML_ALERT_README,
        pylayout.HTML_ALERT_SPONSOR,
        pylayout.HTML_ROW_BUTTON_UPLOAD,
        pylayout.HTML_ROW_TABLE,
        pylayout.HTML_ROW_BUTTON_VIZ,
@@ -51,6 +49,8 @@
        pylayout.HTML_ROW_TABLE_ANALYZE,
        pylayout.HTML_ROW_BUTTON_VIZ_ANALYSIS,
        pylayout.HTML_ROW_GRAPH_ANALYSIS,
        pylayout.HTML_ROW_GRAPH_CUMSUM,
        pylayout.HTML_ROW_GRAPH_CONSISTENCY,
        pylayout.HTML_ALERT_CONTRIBUTION,
        pylayout.HTML_MADEBY,
        pylayout.HTML_OTHER_PROJECTS,
@@ -94,12 +94,13 @@ def callback_upload(content, filename, filedate, _):
    button_viz_outline = True

    if dataframe is not None:
        editable = [False] + [True] * len(dataframe.columns)
        children = pylayoutfunc.create_table_layout(
            dataframe,
            "output-table",
            filename=filename,
            filedate=filedate,
            editable=True,
            editable=editable,
            renamable=True,
        )
        upload_disabled = False
@@ -184,23 +185,40 @@ def callback_download_table(_, table_data, table_columns):
    prevent_initial_call=True,
)
def callback_analyze(_, table_data, table_columns):
    global SUMMARY_ALL

    button_viz_analysis_disabled = True
    button_viz_analysis_outline = True
    row_button_download_analysis_style = {"visibility": "hidden"}

    try:
        dataframe = pyfunc.transform_to_dataframe(table_data, table_columns)
        SUMMARY_ALL = pyfunc.generate_summary_all(dataframe, n_days=["16D", "MS", "YS"])
        tables = [

        # SUMMARY
        summary_all = pyfunc.generate_summary_all(dataframe, n_days=["16D", "MS", "YS"])
        tables_summary = [
            pylayoutfunc.create_table_summary(
                summary, f"table-analyze-{counter}", deletable=False
            )
            for counter, summary in enumerate(SUMMARY_ALL)
            for counter, summary in enumerate(summary_all)
        ]

        children = pylayoutfunc.create_tabcard_table_layout(tables)
        # CUMULATIVE SUM
        cumsum = pyfunc.calc_cumsum(dataframe)

        _, table_cumsum = pylayoutfunc.create_table_layout(
            cumsum, "table-cumsum", deletable=False
        )

        table_cumsum = [table_cumsum]

        # LAYOUT
        tables_all = tables_summary + table_cumsum
        tab_names = "Biweekly Monthly Yearly Cumulative".split()

        children = pylayoutfunc.create_tabcard_table_layout(
            tables_all, tab_names=tab_names
        )

        button_viz_analysis_disabled = False
        button_viz_analysis_outline = False
        row_button_download_analysis_style = {"visibility": "visible"}
@@ -218,51 +236,163 @@ def callback_analyze(_, table_data, table_columns):
@app.callback(
    Output("download-analysis-csv", "data"),
    Input("button-download-analysis-csv", "n_clicks"),
    State("table-analyze-0", "data"),
    State("table-analyze-0", "columns"),
    State("table-analyze-1", "data"),
    State("table-analyze-1", "columns"),
    State("table-analyze-2", "data"),
    State("table-analyze-2", "columns"),
    State("table-cumsum", "data"),
    State("table-cumsum", "columns"),
    prevent_initial_call=True,
)
def callback_download_results(_):
def callback_download_results(
    _,
    biweekly_data,
    biweekly_columns,
    monthly_data,
    monthly_columns,
    yearly_data,
    yearly_columns,
    cumsum_data,
    cumsum_columns,
):

    biweekly = (biweekly_data, biweekly_columns)
    monthly = (monthly_data, monthly_columns)
    yearly = (yearly_data, yearly_columns)

    summary_all = []
    for period in (biweekly, monthly, yearly):
        data, columns = period
        dataframe = pyfunc.transform_to_dataframe(
            data,
            columns,
            multiindex=True,
            apply_numeric=False,
            parse_dates=["max_date"],
        )
        summary_all.append(dataframe)

    dataframe = pd.concat(SUMMARY_ALL, axis=1, keys=["Biweekly", "Monthly", "Yearly"])
    return dcc.send_data_frame(dataframe.to_csv, "results.csv")
    cumsum = pyfunc.transform_to_dataframe(cumsum_data, cumsum_columns)
    stations = cumsum.columns.to_list()
    cumsum.columns = pd.MultiIndex.from_product([stations, [""]])

    dataframe_all = pd.concat(
        summary_all + [cumsum],
        axis=1,
        keys=["Biweekly", "Monthly", "Yearly", "Cumulative"],
    )

    return dcc.send_data_frame(dataframe_all.to_csv, "results.csv")


@app.callback(
    Output("tab-graph-analysis", "children"),
    Output("tab-graph-cumsum", "children"),
    Output("tab-graph-consistency", "children"),
    Input("button-viz-analysis", "n_clicks"),
    State("table-analyze-0", "data"),
    State("table-analyze-0", "columns"),
    State("table-analyze-1", "data"),
    State("table-analyze-1", "columns"),
    State("table-analyze-2", "data"),
    State("table-analyze-2", "columns"),
    State("table-cumsum", "data"),
    State("table-cumsum", "columns"),
    prevent_initial_call=True,
)
def callback_troubleshoot(_):
def callback_graph_analysis(
    _,
    biweekly_data,
    biweekly_columns,
    monthly_data,
    monthly_columns,
    yearly_data,
    yearly_columns,
    cumsum_data,
    cumsum_columns,
):
    from itertools import product

    label_periods = ["Biweekly", "Monthly", "Yearly"]
    label_maxsum = ["Max + Sum"]
    label_raindry = ["Dry + Rain"]
    label_ufunc = label_maxsum + label_raindry

    biweekly = (biweekly_data, biweekly_columns)
    monthly = (monthly_data, monthly_columns)
    yearly = (yearly_data, yearly_columns)

    summary_all = []
    for summary_period in (biweekly, monthly, yearly):
        data, columns = summary_period
        dataframe = pyfunc.transform_to_dataframe(
            data,
            columns,
            multiindex=True,
            apply_numeric=False,
            parse_dates=["max_date"],
        )
        summary_all.append(dataframe)

    graphs_maxsum = [
        pyfigure.figure_summary_maxsum(
            summary,
            title=f"<b>{period}: {title}</b>",
            period=period,
            subplot_titles=["Max", "Sum"],
        )
        for summary, title, period in zip(SUMMARY_ALL, label_maxsum * 3, label_periods)
        for summary, title, period in zip(summary_all, label_maxsum * 3, label_periods)
    ]
    graphs_raindry = [
        pyfigure.figure_summary_raindry(
            summary, title=f"<b>{period}: {title}</b>", period=period
        )
        for summary, title, period in zip(SUMMARY_ALL, label_raindry * 3, label_periods)
        for summary, title, period in zip(summary_all, label_raindry * 3, label_periods)
    ]
    graph_maxdate = [pyfigure.figure_summary_maxdate(SUMMARY_ALL)]
    graph_maxdate = [pyfigure.figure_summary_maxdate(summary_all)]

    all_graphs = graphs_maxsum + graphs_raindry + graph_maxdate
    labels = [": ".join(i) for i in product(label_ufunc, label_periods)]
    labels += ["Maximum Rainfall Events"]

    children = pylayoutfunc.create_tabcard_graph_layout(all_graphs, labels)
    children_analysis = pylayoutfunc.create_tabcard_graph_layout(
        all_graphs, labels, active_tab="Maximum Rainfall Events"
    )

    # CUMSUM

    cumsum = pyfunc.transform_to_dataframe(cumsum_data, cumsum_columns)

    graph_cumsum = [
        pyfigure.figure_cumsum_single(cumsum, col=station) for station in cumsum.columns
    ]

    children_cumsum = pylayoutfunc.create_tabcard_graph_layout(
        graph_cumsum, cumsum.columns
    )

    # CONSISTENCY

    if cumsum.columns.size == 1:
        children_consistency = (
            dcc.Graph(
                figure=pyfigure.figure_empty(text="Not Available for Single Station"),
                config={"staticPlot": True},
            ),
        )
    else:
        graph_consistency = [
            pyfigure.figure_consistency(cumsum, col=station)
            for station in cumsum.columns
        ]

        children_consistency = pylayoutfunc.create_tabcard_graph_layout(
            graph_consistency, cumsum.columns
        )

    return children
    return children_analysis, children_cumsum, children_consistency


@app.callback(
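The new `v1.1.0` callbacks above build the cumulative-rainfall table (`pyfunc.calc_cumsum`) and per-station consistency charts (`pyfigure.figure_consistency`), but those modules are not part of this commit. As a hedged sketch of the double mass curve idea behind the consistency tab: each station's cumulative rainfall is paired with the cumulative mean of the remaining stations, and a roughly straight line suggests a consistent record. The station names and the helper `double_mass` below are hypothetical, not code from the repository.

```python
# Sketch of double-mass-curve data, assuming the textbook consistency check;
# this is not the code inside pyfunc or pyfigure.
import numpy as np
import pandas as pd

# hypothetical annual rainfall totals (mm) for three stations
rng = np.random.default_rng(42)
years = pd.Index(range(2010, 2021), name="year")
annual = pd.DataFrame(
    rng.normal(loc=2000, scale=300, size=(len(years), 3)).round(0),
    index=years,
    columns=["STA_A", "STA_B", "STA_C"],
)

cumulative = annual.cumsum()  # cumulative annual rainfall per station


def double_mass(cumsum: pd.DataFrame, station: str) -> pd.DataFrame:
    """Pair one station's cumulative rainfall with the mean of the others."""
    reference = cumsum.drop(columns=station).mean(axis=1)
    return pd.DataFrame({station: cumsum[station], "reference": reference})


# points for STA_A's double mass curve (station on one axis, reference on the other)
print(double_mass(cumulative, "STA_A"))
```

This also matches the guard in `callback_graph_analysis`: with a single station there is no reference to compare against, so the consistency tab falls back to an empty figure. The exported `results.csv` from `callback_download_results`, meanwhile, carries a two-level column header built by `pd.concat(..., keys=["Biweekly", "Monthly", "Yearly", "Cumulative"])`.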
4 changes: 2 additions & 2 deletions app_config.yml
@@ -4,14 +4,14 @@ DASH_APP:
  DEBUG: False

DASH_THEME:
  THEME: SIMPLEX
  THEME: COSMO

TEMPLATE:
  LOGO_SOURCE: https://raw.githubusercontent.com/hidrokit/static-assets/main/logo_0.4.0-v1.1/hidrokit-hidrokit/50x50square.png
  WATERMARK_SOURCE: https://raw.githubusercontent.com/hidrokit/static-assets/main/logo_0.4.0-v1.1/hidrokit-hidrokit/400x100/400x100transparent.png
  SHOW_LEGEND_INSIDE: False
  SHOW_RANGESELECTOR: False

VERSION: v1.0.0
VERSION: v1.1.0
GITHUB_LINK: https://github.com/taruma/dash-hidrokit-rainfall
GITHUB_REPO: taruma/dash-hidrokit-rainfall
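For context, the two settings changed here (`THEME` and `VERSION`) live in `app_config.yml`. How the app reads this file is not shown in this commit; the snippet below is a minimal sketch that assumes a plain PyYAML load and the nesting shown above.

```python
# Minimal sketch; the repository's actual config loader is not part of this diff.
import yaml

with open("app_config.yml") as f:
    config = yaml.safe_load(f)

print(config["DASH_APP"]["DEBUG"])    # False
print(config["DASH_THEME"]["THEME"])  # "COSMO" after this commit
```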