Sentry Monitoring - Hands-On Local Development Environment Setup for the Snuba Data Platform


Clone the Repositories



Clone getsentry/sentry and getsentry/snuba respectively:


git clone https://github.com/getsentry/sentry.git
git clone https://github.com/getsentry/snuba.git


Install System Dependencies (macOS as an Example)



Xcode CLI tools


xcode-select --install


Brewfile


Enter the sentry directory, where you will find a Brewfile:


cd sentry


Brewfile


# required to run devservices
cask 'docker'
brew 'pyenv'
# required for pyenv's python-build
brew 'openssl'
brew 'readline'
# required for yarn test -u
brew 'watchman'
# required to build some of sentry's dependencies
brew 'pkgconfig'
brew 'libxslt'
brew 'libxmlsec1'
brew 'geoip'
# Currently needed because on Big Sur there's no wheel for it
brew 'librdkafka'
# direnv isn't defined here, because we have it configured to check for a bootstrapped environment.
# If it's installed in the early steps of the setup process, it just leads to confusion.
# brew 'direnv'
tap 'homebrew/cask'
# required for acceptance testing
cask 'chromedriver'


If Docker Desktop is already installed and running on your machine, you can comment out `cask 'docker'`.


Next, run:


brew bundle --verbose


If you did not previously have Docker Desktop, you will also need to launch it manually:


open -g -a Docker.app


Build Toolchain


Sentry depends on Python wheels (packages containing binary extension modules), which are officially distributed for the following platforms:


  • Linux compatible with PEP-513 (manylinux1)
  • macOS 10.15 or later

If your development machine does not run one of the systems above, you will need to install a Rust toolchain. Follow the instructions at https://www.rust-lang.org/tools/install to install the compiler and related tools. Once installed, the Sentry installer will automatically use Rust to build all binary modules, with no extra configuration.
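The platform check described above can be sketched in a few lines of Python. `wheels_available` below is a hypothetical helper for illustration only, not part of Sentry's tooling:

```python
import platform

def wheels_available(system: str, mac_version: str = "") -> bool:
    """Rough check: does Sentry ship prebuilt wheels for this platform?
    Linux is assumed manylinux1-compatible; macOS needs 10.15+.
    Illustrative helper only -- not part of Sentry's tooling."""
    if system == "Linux":
        return True
    if system == "Darwin":
        parts = [int(p) for p in mac_version.split(".")[:2] if p.isdigit()]
        major, minor = (parts + [0, 0])[:2]
        return (major, minor) >= (10, 15)
    # Any other platform: build binary modules from source via Rust.
    return False

# Check the current machine:
print(wheels_available(platform.system(), platform.mac_ver()[0]))
```

If this prints `False`, that is the case where the Rust toolchain is required.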


Sentry generally tracks the latest stable Rust release, which ships every six weeks. Keep your Rust toolchain up to date by occasionally running:


rustup update stable


Python



Sentry uses pyenv to install and manage Python versions. It was installed when you ran brew bundle.


To install the required version of Python, run the following command. This will take a while, because your machine is actually compiling Python:


make setup-pyenv


This assumes you are a Zsh user.


If you type `which python`, you should see something like $HOME/.pyenv/shims/python rather than /usr/bin/python. That is because the following has been added to your startup script.

Run `cat ~/.zprofile` and you will see something like this:


# MacPorts Installer addition on 2021-10-20_at_11:48:22: adding an appropriate PATH variable for use with MacPorts.
export PATH="/opt/local/bin:/opt/local/sbin:$PATH"
# Finished adapting your PATH environment variable for use with MacPorts.
# It is assumed that pyenv is installed via Brew, so this is all we need to do.
eval "$(pyenv init --path)"


Virtual Environment


You are now ready to create a Python virtual environment. Run:


python -m venv .venv


And activate the virtual environment:


source .venv/bin/activate


If everything worked, running `which python` should now yield something like /Users/you/sentry/.venv/bin/python.
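You can also confirm from inside Python itself that the venv interpreter is active, by comparing sys.prefix against sys.base_prefix (standard venv behavior). A minimal illustrative check:

```python
import sys

def in_virtualenv() -> bool:
    """Inside a venv, sys.prefix points at the venv directory while
    sys.base_prefix still points at the base interpreter."""
    return sys.prefix != sys.base_prefix

print(sys.prefix)      # e.g. /Users/you/sentry/.venv when activated
print(in_virtualenv())
```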


Hands-On Snuba Configuration



Start Snuba's Dependency Containers


cd ../sentry
git checkout master
git pull
source .venv/bin/activate
sentry devservices up --exclude=snuba
# 11:17:59 [WARNING] sentry.utils.geo: settings.GEOIP_PATH_MMDB not configured.
# 11:18:01 [INFO] sentry.plugins.github: apps-not-configured
# > Pulling image 'postgres:9.6-alpine'
# > Pulling image 'yandex/clickhouse-server:20.3.9.70'
# > Not starting container 'sentry_relay' because it should be started on-demand with devserver.
# > Creating 'sentry_redis' volume
# > Creating 'sentry_zookeeper_6' volume
# > Creating 'sentry_kafka_6' volume
# > Creating container 'sentry_redis'
# > Creating container 'sentry_zookeeper'
# > Creating container 'sentry_kafka'
# > Starting container 'sentry_redis' (listening: ('127.0.0.1', 6379))
# > Starting container 'sentry_kafka' (listening: ('127.0.0.1', 9092))
# > Starting container 'sentry_zookeeper'
# > Creating 'sentry_clickhouse' volume
# > Creating container 'sentry_clickhouse'
# > Creating 'sentry_postgres' volume
# > Creating 'sentry_wal2json' volume
# > Starting container 'sentry_clickhouse' (listening: ('127.0.0.1', 9000), ('127.0.0.1', 9009), ('127.0.0.1', 8123))
# > Creating container 'sentry_postgres'
# > Starting container 'sentry_postgres' (listening: ('127.0.0.1', 5432))


This checks out the latest version of Sentry on master and brings up all of Snuba's dependencies.

Snuba mainly depends on the clickhouse, zookeeper, kafka, and redis containers.

Check them with `docker ps`:


1149a6f6ff23   postgres:9.6-alpine                  "docker-entrypoint.s…"   3 minutes ago   Up 3 minutes   127.0.0.1:5432->5432/tcp                                                       sentry_postgres
a7f3af7d52bb   yandex/clickhouse-server:20.3.9.70   "/entrypoint.sh"         3 minutes ago   Up 3 minutes   127.0.0.1:8123->8123/tcp, 127.0.0.1:9000->9000/tcp, 127.0.0.1:9009->9009/tcp   sentry_clickhouse
68913ee15c43   confluentinc/cp-zookeeper:6.2.0      "/etc/confluent/dock…"   3 minutes ago   Up 3 minutes   2181/tcp, 2888/tcp, 3888/tcp                                                   sentry_zookeeper
5a248eb26ed3   confluentinc/cp-kafka:6.2.0          "/etc/confluent/dock…"   3 minutes ago   Up 3 minutes   127.0.0.1:9092->9092/tcp                                                       sentry_kafka
0573aff7b5af   redis:5.0-alpine                     "docker-entrypoint.s…"   3 minutes ago   Up 3 minutes   127.0.0.1:6379->6379/tcp                                                       sentry_redis
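Beyond `docker ps`, you can probe the published ports directly. `port_open` below is a hypothetical helper (the port list comes from the devservices startup log above):

```python
import socket

# Ports devservices publishes on localhost (from the startup log above).
SERVICES = {
    "redis": 6379,
    "kafka": 9092,
    "clickhouse-native": 9000,
    "clickhouse-http": 8123,
    "postgres": 5432,
}

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Attempt a TCP connect; True if something is listening."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, port in SERVICES.items():
    status = "up" if port_open("127.0.0.1", port) else "DOWN"
    print(f"{name:18} 127.0.0.1:{port}  {status}")
```

Every service should report `up` before you move on to the Snuba setup below.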


Set Up the Snuba Virtual Environment


cd snuba
make pyenv-setup
python -m venv .venv
source .venv/bin/activate
pip install --upgrade pip==21.1.3
make develop


List the Migrations


snuba migrations list
#  system
#  [ ]  0001_migrations
#
#  events
#  [ ]  0001_events_initial
#  [ ]  0002_events_onpremise_compatibility
#  [ ]  0003_errors
#  [ ]  0004_errors_onpremise_compatibility
#  [ ]  0005_events_tags_hash_map (blocking)
#  [ ]  0006_errors_tags_hash_map (blocking)
#  [ ]  0007_groupedmessages
#  [ ]  0008_groupassignees
#  [ ]  0009_errors_add_http_fields
#  [ ]  0010_groupedmessages_onpremise_compatibility (blocking)
#  [ ]  0011_rebuild_errors
#  [ ]  0012_errors_make_level_nullable
#  [ ]  0013_errors_add_hierarchical_hashes
#  [ ]  0014_backfill_errors (blocking)
#  [ ]  0015_truncate_events
#
#  transactions
#  [ ]  0001_transactions
#  [ ]  0002_transactions_onpremise_fix_orderby_and_partitionby (blocking)
#  [ ]  0003_transactions_onpremise_fix_columns (blocking)
#  [ ]  0004_transactions_add_tags_hash_map (blocking)
#  [ ]  0005_transactions_add_measurements
#  [ ]  0006_transactions_add_http_fields
#  [ ]  0007_transactions_add_discover_cols
#  [ ]  0008_transactions_add_timestamp_index
#  [ ]  0009_transactions_fix_title_and_message
#  [ ]  0010_transactions_nullable_trace_id
#  [ ]  0011_transactions_add_span_op_breakdowns
#  [ ]  0012_transactions_add_spans
#
#  discover
#  [ ]  0001_discover_merge_table
#  [ ]  0002_discover_add_deleted_tags_hash_map
#  [ ]  0003_discover_fix_user_column
#  [ ]  0004_discover_fix_title_and_message
#  [ ]  0005_discover_fix_transaction_name
#  [ ]  0006_discover_add_trace_id
#  [ ]  0007_discover_add_span_id
#
#  outcomes
#  [ ]  0001_outcomes
#  [ ]  0002_outcomes_remove_size_and_bytes
#  [ ]  0003_outcomes_add_category_and_quantity
#  [ ]  0004_outcomes_matview_additions (blocking)
#
#  metrics
#  [ ]  0001_metrics_buckets
#  [ ]  0002_metrics_sets
#  [ ]  0003_counters_to_buckets
#  [ ]  0004_metrics_counters
#  [ ]  0005_metrics_distributions_buckets
#  [ ]  0006_metrics_distributions
#  [ ]  0007_metrics_sets_granularity_10
#  [ ]  0008_metrics_counters_granularity_10
#  [ ]  0009_metrics_distributions_granularity_10
#  [ ]  0010_metrics_sets_granularity_1h
#  [ ]  0011_metrics_counters_granularity_1h
#  [ ]  0012_metrics_distributions_granularity_1h
#  [ ]  0013_metrics_sets_granularity_1d
#  [ ]  0014_metrics_counters_granularity_1d
#  [ ]  0015_metrics_distributions_granularity_1d
#
#  sessions
#  [ ]  0001_sessions
#  [ ]  0002_sessions_aggregates
#  [ ]  0003_sessions_matview


Run the Migrations


snuba migrations migrate --force
#  ......
#  2021-12-01 19:45:57,557 Running migration: 0014_metrics_counters_granularity_1d
#  2021-12-01 19:45:57,575 Finished: 0014_metrics_counters_granularity_1d
#  2021-12-01 19:45:57,589 Running migration: 0015_metrics_distributions_granularity_1d
#  2021-12-01 19:45:57,610 Finished: 0015_metrics_distributions_granularity_1d
#  2021-12-01 19:45:57,623 Running migration: 0001_sessions
#  2021-12-01 19:45:57,656 Finished: 0001_sessions
#  2021-12-01 19:45:57,669 Running migration: 0002_sessions_aggregates
#  2021-12-01 19:45:57,770 Finished: 0002_sessions_aggregates
#  2021-12-01 19:45:57,792 Running migration: 0003_sessions_matview
#  2021-12-01 19:45:57,849 Finished: 0003_sessions_matview
#  Finished running migrations


Verify the Migrations


Enter the ClickHouse container:


docker exec -it sentry_clickhouse clickhouse-client
# Run the following SQL statement:
select count() from sentry_local
# ClickHouse client version 20.3.9.70 (official build).
# Connecting to localhost:9000 as user default.
# Connected to ClickHouse server version 20.3.9 revision 54433.
# a7f3af7d52bb :) select count() from sentry_local
# SELECT count()
# FROM sentry_local
# ┌─count()─┐
# │       0 │
# └─────────┘
# 1 rows in set. Elapsed: 0.008 sec. 
# a7f3af7d52bb :)
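The same check also works over ClickHouse's HTTP interface on port 8123, which the container above publishes. The helper names here are illustrative:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def clickhouse_url(query: str, host: str = "127.0.0.1", port: int = 8123) -> str:
    """Build a URL for ClickHouse's HTTP interface (enabled by default)."""
    return f"http://{host}:{port}/?{urlencode({'query': query})}"

def run_query(query: str) -> str:
    """Execute a query against the sentry_clickhouse container over HTTP."""
    with urlopen(clickhouse_url(query)) as resp:
        return resp.read().decode().strip()

print(clickhouse_url("SELECT count() FROM sentry_local"))
# With the container running: print(run_query("SELECT count() FROM sentry_local"))
```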


List the Entity Datasets


snuba entities list
# Declared Entities:
# discover
# events
# groups
# groupassignee
# groupedmessage
# metrics_sets
# metrics_counters
# metrics_distributions
# outcomes
# outcomes_raw
# sessions
# org_sessions
# spans
# transactions
# discover_transactions
# discover_events


Start the Development Server


This command starts the API and all Snuba consumers to ingest data from Kafka:


snuba devserver


Open http://localhost:1218/events/snql and you will see a simple query UI.
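Queries can also be sent programmatically by POSTing a SnQL string to the same endpoint. The exact payload shape below is an assumption against the Snuba version used here, so treat this as a starting point rather than a definitive client:

```python
import json
from urllib.request import Request, urlopen

SNUBA_URL = "http://localhost:1218/events/snql"

def build_request(snql: str, url: str = SNUBA_URL) -> Request:
    """Wrap a SnQL string in a JSON body for the /events/snql endpoint.
    (Payload shape is an assumption; adjust for your Snuba version.)"""
    body = json.dumps({"query": snql}).encode()
    return Request(url, data=body, headers={"Content-Type": "application/json"})

def query_snuba(snql: str) -> dict:
    """POST the query to the local devserver and decode the JSON reply."""
    with urlopen(build_request(snql)) as resp:
        return json.load(resp)

# Example (requires `snuba devserver` to be running):
# print(query_snuba("MATCH (events) SELECT count() AS count WHERE project_id = 1"))
```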


