
User Guide

README in English

Introduction

This project is the Golang SDK for Alibaba Cloud Log Service (SLS). It wraps and implements the Log Service REST API, helping Golang developers program against Alibaba Cloud Log Service more quickly.

The project consists of three parts:

  1. A wrapper and implementation of the basic Log Service APIs.
  2. The Golang Producer Library, for sending data to Log Service in batches; see the Aliyun LOG Golang Producer quick start for details.
  3. The Golang Consumer Library, for consuming data from Log Service; see the Consumer Library documentation for details.

For the detailed API reference, see: https://help.aliyun.com/document_detail/29007.html

Installation

go get -u github.com/aliyun/aliyun-log-go-sdk
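
The samples below assume the SDK is imported under the sls alias; the write-data sample also uses the gogo/protobuf helpers, since the SDK's log.proto is generated with gogo/protobuf (see the developer section). A typical import block looks roughly like this:

    import (
       sls "github.com/aliyun/aliyun-log-go-sdk" // core client and types such as sls.Log, sls.LogGroup
       "github.com/gogo/protobuf/proto"          // proto.String / proto.Uint32 helpers used when building logs
    )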

Quick Start

Note: all of the samples live in the example directory. Before running any of them, fill in your project, logstore, and other required parameters in the init function of the config.go file in that directory; every sample under example reads its configuration from config.go.

  1. Create a Client

    See the config.go file

    AccessKeyID = "your ak id"
    AccessKeySecret = "your ak secret"
    credentialsProvider := sls.NewStaticCredentialsProvider(AccessKeyID, AccessKeySecret, "")
    Endpoint = "your endpoint" // e.g. cn-hangzhou.log.aliyuncs.com
    Client = sls.CreateNormalInterfaceV2(Endpoint, credentialsProvider)

    To catch configuration errors early, you can check that the Client can successfully call the SLS API right after creating it:

    _, err := Client.ListProject()
    if err != nil {
       panic(err)
    }
  2. Create a project

    See the log_project.go file

    project, err := util.Client.CreateProject(ProjectName, "Project used for testing")
    if err != nil {
       fmt.Println(err)
    }
    fmt.Println(project)
  3. Create a logstore

    See log_logstore.go

    // arguments after the names: ttl = 2 (days), shardCnt = 2, autoSplit = true, maxSplitShard = 64
    err := util.Client.CreateLogStore(ProjectName, LogStoreName, 2, 2, true, 64)
    if err != nil {
       fmt.Println(err)
    }
  4. Create an index

    See index_sample

    indexKeys := map[string]sls.IndexKey{
       "col_0": {
          Token:         []string{" "},
          CaseSensitive: false,
          Type:          "long",
       },
       "col_1": {
          Token:         []string{",", ":", " "},
          CaseSensitive: false,
          Type:          "text",
       },
    }
    index := sls.Index{
       Keys: indexKeys,
       Line: &sls.IndexLine{
          Token:         []string{",", ":", " "},
          CaseSensitive: false,
          IncludeKeys:   []string{},
          ExcludeKeys:   []string{},
       },
    }
    err = util.Client.CreateIndex(util.ProjectName, logstore_name, index)
    if err != nil {
       fmt.Printf("CreateIndex fail, err: %s", err)
       return
    }
    fmt.Println("CreateIndex success")
  5. Write data

    The snippet below sends data with the SDK's raw API. Writing to a logstore through the raw API directly is not recommended; instead, use the producer package provided by the SDK (a sketch follows this snippet), which compresses data automatically and provides a safe-shutdown mechanism so that no data is lost. For MetricStore logstores, sending data with the PutLogsWithMetricStoreURL API improves query performance for high-cardinality time series.

    logs := []*sls.Log{}
    for logIdx := 0; logIdx < 100; logIdx++ {
       content := []*sls.LogContent{}
       for colIdx := 0; colIdx < 10; colIdx++ {
          content = append(content, &sls.LogContent{
             Key:   proto.String(fmt.Sprintf("col_%d", colIdx)),
             Value: proto.String(fmt.Sprintf("log idx: %d, col idx: %d, value: %d", logIdx, colIdx, rand.Intn(10000000))),
          })
       }
       log := &sls.Log{
          Time:     proto.Uint32(uint32(time.Now().Unix())),
          Contents: content,
       }
       logs = append(logs, log)
    }
    loggroup := &sls.LogGroup{
       Topic:  proto.String(""),
       Source: proto.String("10.230.201.117"),
       Logs:   logs,
    }
    // PostLogStoreLogs API Ref: https://intl.aliyun.com/help/doc-detail/29026.htm
    for retryTimes := 0; retryTimes < 10; retryTimes++ {
       err := util.Client.PutLogs(util.ProjectName, util.LogStoreName, loggroup)
       if err == nil {
          fmt.Printf("PutLogs success, retry: %d\n", retryTimes)
          break
       } else {
          fmt.Printf("PutLogs fail, retry: %d, err: %s\n", retryTimes, err)
          //handle the error here; you can add retryable error codes and tune the retry count
          if strings.Contains(err.Error(), sls.WRITE_QUOTA_EXCEED) || strings.Contains(err.Error(), sls.PROJECT_QUOTA_EXCEED) || strings.Contains(err.Error(), sls.SHARD_WRITE_QUOTA_EXCEED) {
             //maybe you should split the shard
             time.Sleep(1000 * time.Millisecond)
          } else if strings.Contains(err.Error(), sls.INTERNAL_SERVER_ERROR) || strings.Contains(err.Error(), sls.SERVER_BUSY) {
             time.Sleep(200 * time.Millisecond)
          }
       }
    }
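
    For reference, here is a minimal sketch of writing the same kind of data through the recommended producer package (github.com/aliyun/aliyun-log-go-sdk/producer). It follows the Producer Library's documented usage, but config fields and signatures can differ between SDK versions, so treat it as an outline and consult the Aliyun LOG Golang Producer quick start for the authoritative example.

    // import producer "github.com/aliyun/aliyun-log-go-sdk/producer"
    producerConfig := producer.GetProducerConfig()
    producerConfig.Endpoint = Endpoint // reuse the values configured in config.go
    producerConfig.AccessKeyID = AccessKeyID
    producerConfig.AccessKeySecret = AccessKeySecret
    producerInstance := producer.InitProducer(producerConfig)
    producerInstance.Start()

    for i := 0; i < 100; i++ {
       // GenerateLog builds an *sls.Log from a timestamp and a key/value map
       log := producer.GenerateLog(uint32(time.Now().Unix()), map[string]string{"content": fmt.Sprintf("test-%d", i)})
       if err := producerInstance.SendLog(ProjectName, LogStoreName, "topic", "127.0.0.1", log); err != nil {
          fmt.Println(err)
       }
    }
    // SafeClose blocks until buffered logs are flushed, so no data is lost on shutdown
    producerInstance.SafeClose()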

  6. Read data

    The snippet below pulls data with the SDK's raw API. Reading and consuming logstore data this way is not recommended; instead, use the SDK's consumer group (a sketch follows this snippet). Consumer groups provide automatic load balancing and retry on failure, and they persist checkpoints automatically so that pulling again after a restart does not return duplicate data.

shards, err := client.ListShards(project, logstore)
if err != nil {
	panic(err)
}

totalLogCount := 0
	for _, shard := range shards {
		fmt.Printf("[shard] %d begin\n", shard.ShardID)
		beginCursor, err := client.GetCursor(project, logstore, shard.ShardID, "begin")
		if err != nil {
			panic(err)
		}
		endCursor, err := client.GetCursor(project, logstore, shard.ShardID, "end")
		if err != nil {
			panic(err)
		}

		nextCursor := beginCursor
		for nextCursor != endCursor {
			gl, nc, err := client.PullLogs(project, logstore, shard.ShardID, nextCursor, endCursor, 10)
			if err != nil {
				fmt.Printf("pull log error : %s\n", err)
				time.Sleep(time.Second)
				continue
			}
			nextCursor = nc
			fmt.Printf("now count %d \n", totalLogCount)
			if gl != nil {
				for _, lg := range gl.LogGroups {
					for _, tag := range lg.LogTags {
						fmt.Printf("[tag] %s : %s\n", tag.GetKey(), tag.GetValue())
					}
					for _, log := range lg.Logs {
						totalLogCount++
						// print log
						for _, content := range log.Contents {
							fmt.Printf("[log] %s : %s\n", content.GetKey(), content.GetValue())
						}
					}
				}
			}
		}
	}
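
For reference, here is a minimal sketch of consuming the same logstore through the recommended consumer group (the consumer package of this SDK). It follows the Consumer Library's documented usage, but the config fields and the process-callback signature may differ between SDK versions, so consult the Consumer Library documentation before relying on it.

package main

import (
	"fmt"
	"os"
	"os/signal"

	sls "github.com/aliyun/aliyun-log-go-sdk"
	consumerLibrary "github.com/aliyun/aliyun-log-go-sdk/consumer"
)

// process is called for every batch of log groups pulled from a shard.
// Returning an empty string tells the worker the batch was handled and
// the checkpoint can be saved.
func process(shardId int, logGroupList *sls.LogGroupList) string {
	fmt.Println("shard", shardId, "log groups", len(logGroupList.LogGroups))
	return ""
}

func main() {
	option := consumerLibrary.LogHubConfig{
		Endpoint:          "your endpoint",
		AccessKeyID:       "your ak id",
		AccessKeySecret:   "your ak secret",
		Project:           "your project",
		Logstore:          "your logstore",
		ConsumerGroupName: "test-consumer-group",
		ConsumerName:      "test-consumer-1",
		// start from the oldest data; afterwards the saved checkpoint is used
		CursorPosition: consumerLibrary.BEGIN_CURSOR,
	}
	worker := consumerLibrary.InitConsumerWorker(option, process)
	worker.Start()

	// run until interrupted, then shut down gracefully so checkpoints are flushed
	ch := make(chan os.Signal, 1)
	signal.Notify(ch, os.Interrupt)
	<-ch
	worker.StopAndWait()
}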
  7. Create a machine group

    See machine_group_sample.go

Machine groups come in two types, IP-based and custom-identifier-based; create one or the other.

Create a custom-identifier machine group

attribute := sls.MachinGroupAttribute{
}
machineList := []string{"mac-user-defined-id-value_XX"}
var machineGroup = &sls.MachineGroup{
   Name:          groupName,
   MachineIDType: "userdefined",
   MachineIDList: machineList,
   Attribute:     attribute,
}
err = util.Client.CreateMachineGroup(ProjectName, machineGroup)
if err != nil {
   fmt.Println(err)
}

Create an IP-based machine group

attribute := sls.MachinGroupAttribute{
}
machineList := []string{"192.168.XX.XX"}
var machineGroup = &sls.MachineGroup{
   Name:          groupName,
   MachineIDType: "ip",
   MachineIDList: machineList,
   Attribute:     attribute,
}
err = util.Client.CreateMachineGroup(ProjectName, machineGroup)
if err != nil {
   fmt.Println(err)
}
  8. Create a Logtail collection config

    The SDK currently supports creating Logtail collection configs in the following modes: full regex, delimiter, JSON, and plugin. The example below creates a full-regex-mode config; a JSON-mode sketch follows it.

    regexConfig := new(sls.RegexConfigInputDetail)
    regexConfig.DiscardUnmatch = false
    regexConfig.Key = []string{"logger", "time", "cluster", "hostname", "sr", "app", "workdir", "exe", "corepath", "signature", "backtrace"}
    regexConfig.Regex = "\\S*\\s+(\\S*)\\s+(\\S*\\s+\\S*)\\s+\\S*\\s+(\\S*)\\s+(\\S*)\\s+(\\S*)\\s+(\\S*)\\s+(\\S*)\\s+(\\S*)\\s+(\\S*)\\s+\\S*\\s+(\\S*)\\s*([^$]+)"
    regexConfig.TimeFormat = "%Y/%m/%d %H:%M:%S"
    regexConfig.LogBeginRegex = `INFO core_dump_info_data .*`
    regexConfig.LogPath = "/cloud/log/tianji/TianjiClient#/core_dump_manager"
    regexConfig.FilePattern = "core_dump_info_data.log*"
    regexConfig.MaxDepth = 0
    sls.InitRegexConfigInputDetail(regexConfig)
    outputDetail := sls.OutputDetail{
       ProjectName:  projectName,
       LogStoreName: logstore,
    }
    logConfig := &sls.LogConfig{
       Name:         configName,
       InputType:    "file",
       OutputType:   "LogService", // Now only supports LogService
       InputDetail:  regexConfig,
       OutputDetail: outputDetail,
    }
    err = util.Client.CreateConfig(projectName, logConfig)
    if err != nil {
       fmt.Println(err)
    }
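
    For the JSON mode mentioned above, a minimal sketch might look like the following. The type and helper names (sls.JSONConfigInputDetail, sls.InitJSONConfigInputDetail) and the exact field set are assumptions based on this SDK's log_config definitions, so verify them against the SDK source before use.

    // hypothetical JSON-mode config; field names are assumptions, check log_config.go
    jsonConfig := new(sls.JSONConfigInputDetail)
    jsonConfig.LogPath = "/var/log/app"         // directory to collect from
    jsonConfig.FilePattern = "*.json"           // file name pattern
    jsonConfig.TimeKey = "time"                 // JSON field that carries the log time
    jsonConfig.TimeFormat = "%Y/%m/%d %H:%M:%S"
    sls.InitJSONConfigInputDetail(jsonConfig)
    jsonLogConfig := &sls.LogConfig{
       Name:         configName + "-json",
       InputType:    "file",
       OutputType:   "LogService",
       InputDetail:  jsonConfig,
       OutputDetail: outputDetail,
    }
    err = util.Client.CreateConfig(projectName, jsonLogConfig)
    if err != nil {
       fmt.Println(err)
    }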
  9. Apply the Logtail config to a machine group

    err = util.Client.ApplyConfigToMachineGroup(ProjectName, confName, mgname)
    if err != nil {
       fmt.Println(err)
    }

Developers

Regenerating the protobuf code

protoc -I=. -I=$GOPATH/src -I=$GOPATH/src/github.com/gogo/protobuf/protobuf --gofast_out=. log.proto
