docs: Add circuit breaker user guide #203

Open

DeshDeepakKant wants to merge 1 commit into main from add-circuit-breaker-guide

Conversation

DeshDeepakKant
Contributor

This PR adds a comprehensive, user-friendly guide for configuring and using circuit breakers in Kmesh. The guide includes:

  • Basic explanation of circuit breaker concepts
  • Configuration parameters with default values (see the sketch below)
  • Step-by-step implementation instructions
  • Monitoring and troubleshooting tips
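
For reference, the circuit breaker in the guide is expressed as an Istio DestinationRule. A minimal sketch, mirroring the circuit-breaker.yaml applied in the verification log further down (the values there are deliberately aggressive so the breaker trips quickly; the defaults documented in the guide may differ):

apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: httpbin-demo-cb
spec:
  host: httpbin-demo
  trafficPolicy:
    connectionPool:
      tcp:
        maxConnections: 1           # at most one TCP connection to the destination
      http:
        http1MaxPendingRequests: 1  # queue at most one pending HTTP/1.1 request
        maxRequestsPerConnection: 1 # retire a connection after a single request
    outlierDetection:
      consecutive5xxErrors: 1       # eject an endpoint after a single 5xx response
      interval: 1s                  # how often ejection analysis runs
      baseEjectionTime: 3s          # minimum time an ejected endpoint stays out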

netlify bot commented Jun 10, 2025

Deploy Preview for kmesh-net ready!

🔨 Latest commit: 7c8a2db
🔍 Latest deploy log: https://app.netlify.com/projects/kmesh-net/deploys/6849128c5493720008577d4a
😎 Deploy Preview: https://deploy-preview-203--kmesh-net.netlify.app

@kmesh-bot
Collaborator

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign kevin-wangzefeng for approval. For more information see the Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@DeshDeepakKant force-pushed the add-circuit-breaker-guide branch from ead3c4a to f68e6a0 on June 10, 2025 08:11
@DeshDeepakKant
Contributor Author

I have run these docs on my local Linux setup and everything works fine. Here is the terminal log for reference:

anya@pop-os:~/kmesh$ kubectl apply -f sample-app.yaml
deployment.apps/httpbin-demo created
service/httpbin-demo created
anya@pop-os:~/kmesh$ kubectl apply -f fortio.yaml
deployment.apps/fortio-demo created
service/fortio-demo created
anya@pop-os:~/kmesh$ kubectl label namespace default istio-injection=enabled --overwrite
namespace/default not labeled
anya@pop-os:~/kmesh$ kubectl get pods
NAME                            READY   STATUS    RESTARTS       AGE
client-test-586bf9f85-z75g9     2/2     Running   18 (38m ago)   3d4h
fortio-8448778768-7gx7m         1/1     Running   9 (38m ago)    3d4h
fortio-demo-57459944b-rczq6     2/2     Running   0              10s
httpbin-55ffdcd94-gmd2d         1/1     Running   11 (38m ago)   3d18h
httpbin-demo-54bcd57df4-vgztk   2/2     Running   0              14s
httpbin-test-5cfc5dcdf5-b2wxc   2/2     Running   18 (38m ago)   3d4h
httpbin-test-5cfc5dcdf5-dc8rj   2/2     Running   18 (38m ago)   3d4h
httpbin-test-5cfc5dcdf5-h8cwd   2/2     Running   18 (38m ago)   3d4h
sleep-5577c64d7c-snfdt          1/1     Running   11 (38m ago)   3d18h
test-service-776c9c59fb-jxzl6   1/1     Running   9 (38m ago)    3d4h
anya@pop-os:~/kmesh$ kubectl exec -it $(kubectl get pod -l app=fortio-demo -o jsonpath={.items[0].metadata.name}) -c fortio -- fortio load -c 20 -qps 0 -n 50 -loglevel Warning http://httpbin-demo:8000/get
06:06:49.616 r1 [INF] logger.go:298> Log level is now 3 Warning (was 2 Info)
Fortio 1.69.5 running at 0 queries per second, 12->12 procs, for 50 calls: http://httpbin-demo:8000/get
Starting at max qps with 20 thread(s) [gomax 12] for exactly 50 calls (2 per thread + 10)
Ended after 266.476109ms : 50 calls. qps=187.63
Aggregated Function Time : count 50 avg 0.062245063 +/- 0.05164 min 0.001688718 max 0.157574178 sum 3.11225315
# range, mid point, percentile, count
>= 0.00168872 <= 0.002 , 0.00184436 , 16.00, 8
> 0.004 <= 0.005 , 0.0045 , 18.00, 1
> 0.008 <= 0.009 , 0.0085 , 20.00, 1
> 0.009 <= 0.01 , 0.0095 , 22.00, 1
> 0.01 <= 0.011 , 0.0105 , 26.00, 2
> 0.011 <= 0.012 , 0.0115 , 28.00, 1
> 0.012 <= 0.014 , 0.013 , 30.00, 1
> 0.014 <= 0.016 , 0.015 , 32.00, 1
> 0.04 <= 0.045 , 0.0425 , 46.00, 7
> 0.045 <= 0.05 , 0.0475 , 58.00, 6
> 0.05 <= 0.06 , 0.055 , 62.00, 2
> 0.08 <= 0.09 , 0.085 , 64.00, 1
> 0.1 <= 0.12 , 0.11 , 90.00, 13
> 0.14 <= 0.157574 , 0.148787 , 100.00, 5
# target 50% 0.0466667
# target 75% 0.108462
# target 90% 0.12
# target 99% 0.155817
# target 99.9% 0.157398
Error cases : no data
# Socket and IP used for each connection:
[0]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000925215 +/- 0 min 0.000925215 max 0.000925215 sum 0.000925215
[1]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000244556 +/- 0 min 0.000244556 max 0.000244556 sum 0.000244556
[2]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000198129 +/- 0 min 0.000198129 max 0.000198129 sum 0.000198129
[3]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000163014 +/- 0 min 0.000163014 max 0.000163014 sum 0.000163014
[4]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000120755 +/- 0 min 0.000120755 max 0.000120755 sum 0.000120755
[5]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000415565 +/- 0 min 0.000415565 max 0.000415565 sum 0.000415565
[6]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000117088 +/- 0 min 0.000117088 max 0.000117088 sum 0.000117088
[7]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000100938 +/- 0 min 0.000100938 max 0.000100938 sum 0.000100938
[8]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000181929 +/- 0 min 0.000181929 max 0.000181929 sum 0.000181929
[9]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000104855 +/- 0 min 0.000104855 max 0.000104855 sum 0.000104855
[10]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000157634 +/- 0 min 0.000157634 max 0.000157634 sum 0.000157634
[11]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000177871 +/- 0 min 0.000177871 max 0.000177871 sum 0.000177871
[12]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000276706 +/- 0 min 0.000276706 max 0.000276706 sum 0.000276706
[13]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000108312 +/- 0 min 0.000108312 max 0.000108312 sum 0.000108312
[14]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 9.7833e-05 +/- 0 min 9.7833e-05 max 9.7833e-05 sum 9.7833e-05
[15]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 8.3696e-05 +/- 0 min 8.3696e-05 max 8.3696e-05 sum 8.3696e-05
[16]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000111558 +/- 0 min 0.000111558 max 0.000111558 sum 0.000111558
[17]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000106478 +/- 0 min 0.000106478 max 0.000106478 sum 0.000106478
[18]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000113281 +/- 0 min 0.000113281 max 0.000113281 sum 0.000113281
[19]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000369659 +/- 0 min 0.000369659 max 0.000369659 sum 0.000369659
Connection time (s) : count 20 avg 0.0002087536 +/- 0.0001871 min 8.3696e-05 max 0.000925215 sum 0.004175072
Sockets used: 20 (for perfect keepalive, would be 20)
Uniform: false, Jitter: false, Catchup allowed: true
IP addresses distribution:
10.96.66.60:8000: 20
Code 200 : 50 (100.0 %)
Response Header Sizes : count 50 avg 231.12 +/- 0.7652 min 230 max 232 sum 11556
Response Body/Total Sizes : count 50 avg 666.12 +/- 0.7652 min 665 max 667 sum 33306
All done 50 calls (plus 0 warmup) 62.245 ms avg, 187.6 qps
anya@pop-os:~/kmesh$ kubectl apply -f circuit-breaker.yaml
destinationrule.networking.istio.io/httpbin-demo-cb created
anya@pop-os:~/kmesh$ kubectl get destinationrule httpbin-demo-cb -o yaml
apiVersion: networking.istio.io/v1
kind: DestinationRule
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"networking.istio.io/v1alpha3","kind":"DestinationRule","metadata":{"an
notations":{},"name":"httpbin-demo-cb","namespace":"default"},"spec":{"host":"httpbin-demo"
,"trafficPolicy":{"connectionPool":{"http":{"http1MaxPendingRequests":1,"maxRequestsPerConn
ection":1},"tcp":{"maxConnections":1}},"outlierDetection":{"baseEjectionTime":"3s","consecu
tive5xxErrors":1,"interval":"1s"}}}}
  creationTimestamp: "2025-06-10T06:07:12Z"
  generation: 1
  name: httpbin-demo-cb
  namespace: default
  resourceVersion: "180210"
  uid: 1f38cb7b-fc2e-40d9-bd41-eb82df0e9b14
spec:
  host: httpbin-demo
  trafficPolicy:
    connectionPool:
      http:
        http1MaxPendingRequests: 1
        maxRequestsPerConnection: 1
      tcp:
        maxConnections: 1
    outlierDetection:
      baseEjectionTime: 3s
      consecutive5xxErrors: 1
      interval: 1s
anya@pop-os:~/kmesh$ kubectl exec -it $(kubectl get pod -l app=fortio-demo -o jsonpath={.items[0].metadata.name}) -c fortio -- fortio load -c 20 -qps 0 -n 50 -loglevel Warning http://httpbin-demo:8000/get
06:07:26.527 r1 [INF] logger.go:298> Log level is now 3 Warning (was 2 Info)
Fortio 1.69.5 running at 0 queries per second, 12->12 procs, for 50 calls: http://httpbin-demo:8000/get
Starting at max qps with 20 thread(s) [gomax 12] for exactly 50 calls (2 per thread + 10)
06:07:26.544 r40 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:07:26.544 r42 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=6, run=0
06:07:26.544 r41 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=5, run=0
06:07:26.544 r98 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=14, run=0
06:07:26.545 r38 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:07:26.545 r43 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=7, run=0
06:07:26.545 r99 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=15, run=0
06:07:26.545 r103 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=19, run=0
06:07:26.546 r102 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=18, run=0
06:07:26.546 r48 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=12, run=0
06:07:26.546 r47 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=11, run=0
06:07:26.547 r100 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=16, run=0
06:07:26.547 r36 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:07:26.547 r40 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:07:26.548 r37 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=1, run=0
06:07:26.548 r48 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=12, run=0
06:07:26.548 r97 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=13, run=0
06:07:26.549 r39 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:07:26.549 r98 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=14, run=0
06:07:26.549 r36 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:07:26.550 r37 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=1, run=0
06:07:26.550 r42 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=6, run=0
06:07:26.550 r41 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=5, run=0
06:07:26.551 r43 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=7, run=0
06:07:26.551 r38 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:07:26.552 r99 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=15, run=0
06:07:26.552 r44 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=8, run=0
06:07:26.553 r103 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=19, run=0
06:07:26.554 r47 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=11, run=0
06:07:26.554 r102 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=18, run=0
06:07:26.555 r46 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=10, run=0
06:07:26.555 r100 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=16, run=0
06:07:26.556 r39 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:07:26.556 r45 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=9, run=0
06:07:26.556 r36 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:07:26.557 r36 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:07:26.559 r36 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:07:26.559 r101 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=17, run=0
Ended after 46.714207ms : 50 calls. qps=1070.3
Aggregated Function Time : count 50 avg 0.0062017999 +/- 0.003819 min 0.000356315 max 0.019379158 sum 0.310089995
# range, mid point, percentile, count
>= 0.000356315 <= 0.001 , 0.000678158 , 4.00, 2
> 0.001 <= 0.002 , 0.0015 , 12.00, 4
> 0.002 <= 0.003 , 0.0025 , 20.00, 4
> 0.003 <= 0.004 , 0.0035 , 30.00, 5
> 0.004 <= 0.005 , 0.0045 , 34.00, 2
> 0.005 <= 0.006 , 0.0055 , 48.00, 7
> 0.006 <= 0.007 , 0.0065 , 66.00, 9
> 0.007 <= 0.008 , 0.0075 , 80.00, 7
> 0.008 <= 0.009 , 0.0085 , 86.00, 3
> 0.009 <= 0.01 , 0.0095 , 90.00, 2
> 0.012 <= 0.014 , 0.013 , 94.00, 2
> 0.014 <= 0.016 , 0.015 , 98.00, 2
> 0.018 <= 0.0193792 , 0.0186896 , 100.00, 1
# target 50% 0.00611111
# target 75% 0.00764286
# target 90% 0.01
# target 99% 0.0186896
# target 99.9% 0.0193102
Error cases : count 38 avg 0.0054972308 +/- 0.002523 min 0.000356315 max 0.009876392 sum 0.208894769
# range, mid point, percentile, count
>= 0.000356315 <= 0.001 , 0.000678158 , 5.26, 2
> 0.001 <= 0.002 , 0.0015 , 15.79, 4
> 0.002 <= 0.003 , 0.0025 , 23.68, 3
> 0.003 <= 0.004 , 0.0035 , 26.32, 1
> 0.004 <= 0.005 , 0.0045 , 28.95, 1
> 0.005 <= 0.006 , 0.0055 , 44.74, 6
> 0.006 <= 0.007 , 0.0065 , 68.42, 9
> 0.007 <= 0.008 , 0.0075 , 86.84, 7
> 0.008 <= 0.009 , 0.0085 , 94.74, 3
> 0.009 <= 0.00987639 , 0.0094382 , 100.00, 2
# target 50% 0.00622222
# target 75% 0.00735714
# target 90% 0.0084
# target 99% 0.00970988
# target 99.9% 0.00985974
# Socket and IP used for each connection:
[0]   6 socket used, resolved to 10.96.66.60:8000, connection timing : count 6 avg 0.00026995667 +/- 0.00017 min 0.000135372 max 0.000631818 sum 0.00161974
[1]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000407349 +/- 0.0001648 min 0.000242532 max 0.000572166 sum 0.000814698
[2]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000500072 +/- 0.0001212 min 0.000378896 max 0.000621248 sum 0.001000144
[3]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000388229 +/- 9.653e-05 min 0.000291694 max 0.000484764 sum 0.000776458
[4]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.0003093315 +/- 0.0001311 min 0.000178202 max 0.000440461 sum 0.000618663
[5]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.0006220995 +/- 0.0002605 min 0.000361594 max 0.000882605 sum 0.001244199
[6]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000379612 +/- 2.897e-05 min 0.000350643 max 0.000408581 sum 0.000759224
[7]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000726104 +/- 0.0001522 min 0.00057388 max 0.000878328 sum 0.001452208
[8]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000161281 +/- 0 min 0.000161281 max 0.000161281 sum 0.000161281
[9]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000254865 +/- 0 min 0.000254865 max 0.000254865 sum 0.000254865
[10]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000195064 +/- 0 min 0.000195064 max 0.000195064 sum 0.000195064
[11]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000246735 +/- 5.115e-05 min 0.000195585 max 0.000297885 sum 0.00049347
[12]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000135378 +/- 2.839e-05 min 0.00010699 max 0.000163766 sum 0.000270756
[13]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.0002237725 +/- 2.864e-05 min 0.000195134 max 0.000252411 sum 0.000447545
[14]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000404669 +/- 0.0001225 min 0.000282126 max 0.000527212 sum 0.000809338
[15]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.0001558705 +/- 4.365e-05 min 0.000112219 max 0.000199522 sum 0.000311741
[16]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.000241956 +/- 3.108e-05 min 0.000210873 max 0.000273039 sum 0.000483912
[17]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000130002 +/- 0 min 0.000130002 max 0.000130002 sum 0.000130002
[18]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.0001656135 +/- 3.46e-05 min 0.000131014 max 0.000200213 sum 0.000331227
[19]   2 socket used, resolved to 10.96.66.60:8000, connection timing : count 2 avg 0.0004658185 +/- 0.0002057 min 0.000260075 max 0.000671562 sum 0.000931637
Connection time (s) : count 40 avg 0.0003276543 +/- 0.0001998 min 0.00010699 max 0.000882605 sum 0.013106172
Sockets used: 40 (for perfect keepalive, would be 20)
Uniform: false, Jitter: false, Catchup allowed: true
IP addresses distribution:
10.96.66.60:8000: 40
Code 200 : 12 (24.0 %)
Code 503 : 38 (76.0 %)
Response Header Sizes : count 50 avg 55.24 +/- 98.3 min 0 max 231 sum 2762
Response Body/Total Sizes : count 50 avg 342.8 +/- 181.2 min 241 max 666 sum 17140
All done 50 calls (plus 0 warmup) 6.202 ms avg, 1070.3 qps
anya@pop-os:~/kmesh$ kubectl exec $(kubectl get pod -l app=fortio-demo -o jsonpath={.items[0].metadata.name}) -c istio-proxy -- pilot-agent request GET stats | grep circuit_breaker
cluster.xds-grpc;.circuit_breakers.default.cx_open: 0
cluster.xds-grpc;.circuit_breakers.default.cx_pool_open: 0
cluster.xds-grpc;.circuit_breakers.default.rq_open: 0
cluster.xds-grpc;.circuit_breakers.default.rq_pending_open: 0
cluster.xds-grpc;.circuit_breakers.default.rq_retry_open: 0
cluster.xds-grpc;.circuit_breakers.high.cx_open: 0
cluster.xds-grpc;.circuit_breakers.high.cx_pool_open: 0
cluster.xds-grpc;.circuit_breakers.high.rq_open: 0
cluster.xds-grpc;.circuit_breakers.high.rq_pending_open: 0
cluster.xds-grpc;.circuit_breakers.high.rq_retry_open: 0
istiocustom.istio_requests_total.reporter.source.source_workload.fortio-demo.source_canonic
al_service.fortio-demo.source_canonical_revision.latest.source_workload_namespace.default.s
ource_principal.spiffe://cluster.local/ns/default/sa/default.source_app.fortio-demo.source_
version.source_cluster.Kubernetes.destination_workload.httpbin-demo.destination_workload_na
mespace.default.destination_principal.spiffe://cluster.local/ns/default/sa/default.destinat
ion_app.httpbin-demo.destination_version.latest.destination_service.httpbin-demo.default.sv
c.cluster.local.destination_canonical_service.httpbin-demo.destination_canonical_revision.l
atest.destination_service_name.httpbin-demo.destination_service_namespace.default.destinati
on_cluster.Kubernetes.request_protocol.http.response_code.200.grpc_response_status.response
_flags.-.connection_security_policy.unknown: 62
istiocustom.istio_requests_total.reporter.source.source_workload.fortio-demo.source_canonic
al_service.fortio-demo.source_canonical_revision.latest.source_workload_namespace.default.s
ource_principal.unknown.source_app.fortio-demo.source_version.source_cluster.Kubernetes.des
tination_workload.unknown.destination_workload_namespace.unknown.destination_principal.unkn
own.destination_app.unknown.destination_version.unknown.destination_service.httpbin-demo.de
fault.svc.cluster.local.destination_canonical_service.unknown.destination_canonical_revisio
n.latest.destination_service_name.httpbin-demo.destination_service_namespace.default.destin
ation_cluster.unknown.request_protocol.http.response_code.503.grpc_response_status.response
_flags.UO.connection_security_policy.unknown: 38
istiocustom.istio_request_bytes.reporter.source.source_workload.fortio-demo.source_canonica
l_service.fortio-demo.source_canonical_revision.latest.source_workload_namespace.default.so
urce_principal.spiffe://cluster.local/ns/default/sa/default.source_app.fortio-demo.source_v
ersion.source_cluster.Kubernetes.destination_workload.httpbin-demo.destination_workload_nam
espace.default.destination_principal.spiffe://cluster.local/ns/default/sa/default.destinati
on_app.httpbin-demo.destination_version.latest.destination_service.httpbin-demo.default.svc
.cluster.local.destination_canonical_service.httpbin-demo.destination_canonical_revision.la
test.destination_service_name.httpbin-demo.destination_service_namespace.default.destinatio
n_cluster.Kubernetes.request_protocol.http.response_code.200.grpc_response_status.response_
flags.-.connection_security_policy.unknown: P0(nan,84) P25(nan,84.25) P50(nan,84.5) P75(nan
,84.75) P90(nan,84.9) P95(nan,84.95) P99(nan,84.99) P99.5(nan,84.995) P99.9(nan,84.999) P10
0(nan,85)
istiocustom.istio_request_bytes.reporter.source.source_workload.fortio-demo.source_canonica
l_service.fortio-demo.source_canonical_revision.latest.source_workload_namespace.default.so
urce_principal.unknown.source_app.fortio-demo.source_version.source_cluster.Kubernetes.dest
ination_workload.unknown.destination_workload_namespace.unknown.destination_principal.unkno
wn.destination_app.unknown.destination_version.unknown.destination_service.httpbin-demo.def
ault.svc.cluster.local.destination_canonical_service.unknown.destination_canonical_revision
.latest.destination_service_name.httpbin-demo.destination_service_namespace.default.destina
tion_cluster.unknown.request_protocol.http.response_code.503.grpc_response_status.response_
flags.UO.connection_security_policy.unknown: P0(nan,84) P25(nan,84.25) P50(nan,84.5) P75(na
n,84.75) P90(nan,84.9) P95(nan,84.95) P99(nan,84.99) P99.5(nan,84.995) P99.9(nan,84.999) P1
00(nan,85)
istiocustom.istio_request_duration_milliseconds.reporter.source.source_workload.fortio-demo
.source_canonical_service.fortio-demo.source_canonical_revision.latest.source_workload_name
space.default.source_principal.spiffe://cluster.local/ns/default/sa/default.source_app.fort
io-demo.source_version.source_cluster.Kubernetes.destination_workload.httpbin-demo.destinat
ion_workload_namespace.default.destination_principal.spiffe://cluster.local/ns/default/sa/d
efault.destination_app.httpbin-demo.destination_version.latest.destination_service.httpbin-
demo.default.svc.cluster.local.destination_canonical_service.httpbin-demo.destination_canon
ical_revision.latest.destination_service_name.httpbin-demo.destination_service_namespace.de
fault.destination_cluster.Kubernetes.request_protocol.http.response_code.200.grpc_response_
status.response_flags.-.connection_security_policy.unknown: P0(nan,1) P25(nan,4.08333333333
3333) P50(nan,30.5) P75(nan,108.33333333333333) P90(nan,118.80000000000001) P95(nan,133.8)
P99(nan,138.76) P99.5(nan,139.38) P99.9(nan,139.876) P100(nan,140)
istiocustom.istio_request_duration_milliseconds.reporter.source.source_workload.fortio-demo
.source_canonical_service.fortio-demo.source_canonical_revision.latest.source_workload_name
space.default.source_principal.unknown.source_app.fortio-demo.source_version.source_cluster
.Kubernetes.destination_workload.unknown.destination_workload_namespace.unknown.destination
_principal.unknown.destination_app.unknown.destination_version.unknown.destination_service.
httpbin-demo.default.svc.cluster.local.destination_canonical_service.unknown.destination_ca
nonical_revision.latest.destination_service_name.httpbin-demo.destination_service_namespace
.default.destination_cluster.unknown.request_protocol.http.response_code.503.grpc_response_
status.response_flags.UO.connection_security_policy.unknown: P0(nan,0) P25(nan,0) P50(nan,0
) P75(nan,0) P90(nan,0) P95(nan,0) P99(nan,0) P99.5(nan,0) P99.9(nan,0) P100(nan,0)
istiocustom.istio_response_bytes.reporter.source.source_workload.fortio-demo.source_canonic
al_service.fortio-demo.source_canonical_revision.latest.source_workload_namespace.default.s
ource_principal.spiffe://cluster.local/ns/default/sa/default.source_app.fortio-demo.source_
version.source_cluster.Kubernetes.destination_workload.httpbin-demo.destination_workload_na
mespace.default.destination_principal.spiffe://cluster.local/ns/default/sa/default.destinat
ion_app.httpbin-demo.destination_version.latest.destination_service.httpbin-demo.default.sv
c.cluster.local.destination_canonical_service.httpbin-demo.destination_canonical_revision.l
atest.destination_service_name.httpbin-demo.destination_service_namespace.default.destinati
on_cluster.Kubernetes.request_protocol.http.response_code.200.grpc_response_status.response
_flags.-.connection_security_policy.unknown: P0(nan,660) P25(nan,662.5) P50(nan,665) P75(na
n,667.5) P90(nan,669) P95(nan,669.5) P99(nan,669.9) P99.5(nan,669.95) P99.9(nan,669.99) P10
0(nan,670)
istiocustom.istio_response_bytes.reporter.source.source_workload.fortio-demo.source_canonic
al_service.fortio-demo.source_canonical_revision.latest.source_workload_namespace.default.s
ource_principal.unknown.source_app.fortio-demo.source_version.source_cluster.Kubernetes.des
tination_workload.unknown.destination_workload_namespace.unknown.destination_principal.unkn
own.destination_app.unknown.destination_version.unknown.destination_service.httpbin-demo.de
fault.svc.cluster.local.destination_canonical_service.unknown.destination_canonical_revisio
n.latest.destination_service_name.httpbin-demo.destination_service_namespace.default.destin
ation_cluster.unknown.request_protocol.http.response_code.503.grpc_response_status.response
_flags.UO.connection_security_policy.unknown: P0(nan,240) P25(nan,242.5) P50(nan,245) P75(n
an,247.5) P90(nan,249) P95(nan,249.5) P99(nan,249.9) P99.5(nan,249.95) P99.9(nan,249.99) P1
00(nan,250)

anya@pop-os:~/kmesh$ kubectl exec -it $(kubectl get pod -l app=fortio-demo -o jsonpath={.items[0].metadata.name}) -c fortio -- fortio load -c 5 -qps 10 -n 50 -loglevel Warning http://httpbin-demo:8000/get
06:08:07.096 r1 [INF] logger.go:298> Log level is now 3 Warning (was 2 Info)
Fortio 1.69.5 running at 10 queries per second, 12->12 procs, for 50 calls: http://httpbin-demo:8000/get
Starting at 10 qps with 5 thread(s) [gomax 12] : exactly 50, 10 calls each (total 50 + 0)
06:08:07.100 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:07.100 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:07.101 r60 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:08:07.101 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:07.659 r60 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:08:07.659 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:07.659 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:07.659 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:08.213 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:08.214 r60 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:08:08.214 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:08.769 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:08.770 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:08.770 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:09.324 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:09.324 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:09.324 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:09.880 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:09.880 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:09.880 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:10.436 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:10.437 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:10.437 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:10.991 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:10.991 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:10.991 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:11.000 r60 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:08:11.547 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:11.547 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:11.547 r60 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
06:08:11.547 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:12.101 r62 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=4, run=0
06:08:12.101 r61 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=3, run=0
06:08:12.102 r58 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=0, run=0
06:08:12.102 r60 [WRN] http_client.go:1151> Non ok http code, code=503, status="HTTP/1.1 503", thread=2, run=0
Ended after 5.004305089s : 50 calls. qps=9.9914
Sleep times : count 45 avg 0.55102679 +/- 0.002249 min 0.544093835 max 0.554579866 sum 24.7962055
Aggregated Function Time : count 50 avg 0.0026031657 +/- 0.002025 min 0.00092809 max 0.009639721 sum 0.130158286
# range, mid point, percentile, count
>= 0.00092809 <= 0.001 , 0.000964045 , 6.00, 3
> 0.001 <= 0.002 , 0.0015 , 60.00, 27
> 0.002 <= 0.003 , 0.0025 , 68.00, 4
> 0.003 <= 0.004 , 0.0035 , 78.00, 5
> 0.004 <= 0.005 , 0.0045 , 86.00, 4
> 0.005 <= 0.006 , 0.0055 , 92.00, 3
> 0.006 <= 0.007 , 0.0065 , 96.00, 2
> 0.008 <= 0.009 , 0.0085 , 98.00, 1
> 0.009 <= 0.00963972 , 0.00931986 , 100.00, 1
# target 50% 0.00181481
# target 75% 0.0037
# target 90% 0.00566667
# target 99% 0.00931986
# target 99.9% 0.00960773
Error cases : count 35 avg 0.0016426127 +/- 0.001407 min 0.00092809 max 0.009639721 sum 0.057491445
# range, mid point, percentile, count
>= 0.00092809 <= 0.001 , 0.000964045 , 8.57, 3
> 0.001 <= 0.002 , 0.0015 , 85.71, 27
> 0.002 <= 0.003 , 0.0025 , 97.14, 4
> 0.009 <= 0.00963972 , 0.00931986 , 100.00, 1
# target 50% 0.00153704
# target 75% 0.00186111
# target 90% 0.002375
# target 99% 0.00941582
# target 99.9% 0.00961733
# Socket and IP used for each connection:
[0]  10 socket used, resolved to 10.96.66.60:8000, connection timing : count 10 avg 0.0001559387 +/- 3.219e-05 min 0.000111268 max 0.000218758 sum 0.001559387
[1]   1 socket used, resolved to 10.96.66.60:8000, connection timing : count 1 avg 0.000182961 +/- 0 min 0.000182961 max 0.000182961 sum 0.000182961
[2]   6 socket used, resolved to 10.96.66.60:8000, connection timing : count 6 avg 0.0001705845 +/- 2.281e-05 min 0.00014497 max 0.000206224 sum 0.001023507
[3]   9 socket used, resolved to 10.96.66.60:8000, connection timing : count 9 avg 0.00017089556 +/- 3.916e-05 min 0.000103724 max 0.000224709 sum 0.00153806
[4]  10 socket used, resolved to 10.96.66.60:8000, connection timing : count 10 avg 0.0001823889 +/- 3.909e-05 min 0.000119322 max 0.000259514 sum 0.001823889
Connection time (s) : count 36 avg 0.00017021678 +/- 3.584e-05 min 0.000103724 max 0.000259514 sum 0.006127804
Sockets used: 36 (for perfect keepalive, would be 5)
Uniform: false, Jitter: false, Catchup allowed: true
IP addresses distribution:
10.96.66.60:8000: 36
Code 200 : 15 (30.0 %)
Code 503 : 35 (70.0 %)
Response Header Sizes : count 50 avg 69 +/- 105.4 min 0 max 230 sum 3450
Response Body/Total Sizes : count 50 avg 368.88 +/- 193.9 min 241 max 665 sum 18444
All done 50 calls (plus 0 warmup) 2.603 ms avg, 10.0 qps
anya@pop-os:~/kmesh$ kubectl delete -f circuit-breaker.yaml && kubectl delete -f sample-app.yaml && kubectl delete -f fortio.yaml
destinationrule.networking.istio.io "httpbin-demo-cb" deleted
deployment.apps "httpbin-demo" deleted
service "httpbin-demo" deleted
deployment.apps "fortio-demo" deleted
service "fortio-demo" deleted

@kmesh-bot added size/XL and removed size/L labels on Jun 11, 2025
@DeshDeepakKant force-pushed the add-circuit-breaker-guide branch from 92b1bb2 to 7c8a2db on June 11, 2025 05:22
@DeshDeepakKant changed the title from "Add circuit breaker user guide" to "docs: Add circuit breaker user guide" on Jun 12, 2025