Requirements
- Target platform: OpenClaw
- Install method: Manual import
- Extraction: Extract archive
- Prerequisites: OpenClaw
- Primary doc: SKILL.md
Query and analyze Tencent Cloud CLS logs
Hand the extracted package to your coding agent with a concrete install brief instead of working through it manually. For example:
I downloaded a skill package from Yavira. Read SKILL.md from the extracted folder and install it by following the included instructions. Tell me what you changed and call out any manual steps you could not complete.
I downloaded an updated skill package from Yavira. Read SKILL.md from the extracted folder, compare it with my current installation, and upgrade it while preserving any custom configuration unless the package docs explicitly say otherwise. Summarize what changed and any follow-up checks I should run.
Query and analyze Tencent Cloud CLS logs.
Install clscli (Homebrew):

    brew tap dbwang0130/clscli
    brew install dbwang0130/clscli/clscli

Get credentials and the region list: https://cloud.tencent.com/document/api/614/56474

Set the environment variables (same as the Tencent Cloud API common parameters):

    export TENCENTCLOUD_SECRET_ID="your-secret-id"
    export TENCENTCLOUD_SECRET_KEY="your-secret-key"

Specify the region via `--region` (e.g. `ap-guangzhou`).
> [!IMPORTANT]
> If you do not know the log topic, list topics first.
List topics in a region to determine which `--region` and topic ID to use for `query`/`context`.

    clscli topics --region <region> [--topic-name name] [--logset-name name] [--logset-id id] [--limit 20] [--offset 0]

Examples: `--output=json`, `--output=csv`, `-o topics.csv`

| Option | Required | Description |
| --- | --- | --- |
| `--region` | yes | CLS region, e.g. `ap-guangzhou` |
| `--topic-name` | no | Filter by topic name (fuzzy match) |
| `--logset-name` | no | Filter by logset name (fuzzy match) |
| `--logset-id` | no | Filter by logset ID |
| `--limit` | no | Page size, default 20, max 100 |
| `--offset` | no | Pagination offset, default 0 |
| `--output`, `-o` | no | Output: json, csv, or file path |

Output columns: Region, TopicId, TopicName, LogsetId, CreateTime, StorageType.
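As an illustration of the topic-listing step, the sketch below composes a `clscli topics` invocation and prints it rather than executing it, so it stands alone without clscli installed. The region and logset name are hypothetical placeholders, not values from this package:

```shell
# Dry-run sketch: build a topics listing command for a
# hypothetical region and logset name, then print it.
REGION="ap-guangzhou"       # assumed region
LOGSET_NAME="my-app"        # hypothetical logset name (fuzzy match)

CMD="clscli topics --region $REGION --logset-name $LOGSET_NAME --output=json"
echo "$CMD"
```

In a real session you would run the printed command and read the TopicId column from its output.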
    clscli query -q "[query condition] | [SQL statement]" --region <region> -t <TopicId> --last 1h

Examples:
- Time: `--last 1h`, `--last 30m`; or `--from`/`--to` (Unix ms)
- Multiple topics: `--topics <id1>,<id2>` or multiple `-t <id>`
- Auto pagination and cap: `--max 5000` (paginate until 5000 logs or ListOver)
- Output: `--output=json`, `--output=csv`, `-o result.json` (write to file)

| Option | Required | Description |
| --- | --- | --- |
| `--region` | yes | CLS region, e.g. `ap-guangzhou` |
| `-q`, `--query` | yes | Query condition or SQL, e.g. `level:ERROR` or `* \| select count(*) as cnt` |
| `-t`, `--topic` | one of `-t`/`--topics` | Single log topic ID |
| `--topics` | one of `-t`/`--topics` | Comma-separated topic IDs, max 50 |
| `--last` | one of `--last`/`--from`/`--to` | Time range, e.g. 1h, 30m, 24h |
| `--from`, `--to` | one of `--last`/`--from`/`--to` | Start/end time (Unix ms) |
| `--limit` | no | Logs per request, default 100, max 1000 |
| `--max` | no | Max total logs; when non-zero, auto-paginate until reached or ListOver |
| `--output`, `-o` | no | Output: json, csv, or file path |
| `--sort` | no | Sort: asc or desc, default desc |

Query condition syntax

Two syntaxes are supported:
- CQL (CLS Query Language): CLS-specific query syntax for logs; easy to use, recommended.
- Lucene: open-source Lucene syntax; not designed for log search, with more restrictions on special characters, case, and wildcards; not recommended.

CQL syntax

| Syntax | Description |
| --- | --- |
| `key:value` | Key-value search; logs where field (key) contains value, e.g. `level:ERROR` |
| `value` | Full-text search; logs containing value, e.g. `ERROR` |
| `AND` | Logical AND, case-insensitive, e.g. `level:ERROR AND pid:1234` |
| `OR` | Logical OR, case-insensitive, e.g. `level:ERROR OR level:WARNING`, `level:(ERROR OR WARNING)` |
| `NOT` | Logical NOT, case-insensitive, e.g. `level:ERROR NOT pid:1234`, `level:ERROR AND NOT pid:1234` |
| `()` | Grouping for precedence, e.g. `level:(ERROR OR WARNING) AND pid:1234`. Note: AND has higher precedence than OR when there are no parentheses. |
| `" "` | Phrase search; double-quoted string, words and order must match, e.g. `name:"john Smith"`. No logical operators inside a phrase. |
| `' '` | Phrase search; single quotes, same as `" "`; use when the phrase contains double quotes, e.g. `body:'user_name:"bob"'` |
| `*` | Wildcard; zero or more chars, e.g. `host:www.test*.com`. No prefix wildcard. |
| `>`, `>=`, `<`, `<=`, `=` | Range operators for numeric values, e.g. `status>400`, `status:>=400` |
| `\` | Escape; the escaped char is literal. Escape space, `:`, `()`, `>`, `=`, `<`, `"`, `'`, `*` in values. |
| `key:*` | text: field exists (any value). long/double: field exists and is numeric, e.g. `response_time:*` |
| `key:""` | text: field exists and is empty. long/double: value is not numeric or field is missing, e.g. `response_time:""` |

SQL statement syntax

| Syntax | Description |
| --- | --- |
| `SELECT` | Select from table; data from the current log topic matching the query condition |
| `AS` | Alias for a column (KEY) |
| `GROUP BY` | With aggregate functions, group by one or more columns (KEY) |
| `ORDER BY` | Sort the result set by KEY |
| `LIMIT` | Limit rows, default 100, max 1M |
| `WHERE` | Filter raw data |
| `HAVING` | Filter after GROUP BY, before ORDER BY; WHERE filters raw data |
| Nested subquery | One SELECT inside another for multi-step analysis |
| SQL functions | Richer analysis: IP geo, time format, string split/join, JSON extract, math, distinct count, etc. |
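As a sketch of the query syntax above, the snippet below composes two hypothetical invocations (one plain CQL condition, one CQL piped into SQL) and prints them instead of executing them; the topic ID is made up for illustration:

```shell
# Dry-run sketch against a made-up topic ID.
TOPIC="0f1e2d3c-example-topic-id"   # hypothetical topic ID

# CQL only: errors excluding one pid, last 30 minutes
Q1='level:ERROR AND NOT pid:1234'
echo "clscli query -q \"$Q1\" --region ap-guangzhou -t $TOPIC --last 30m"

# CQL | SQL: count matching error logs over the last hour
Q2='level:ERROR | select count(*) as cnt'
echo "clscli query -q \"$Q2\" --region ap-guangzhou -t $TOPIC --last 1h --output=json"
```

Quoting the whole `-q` argument keeps the CQL/SQL pipe (`|`) from being interpreted by the shell.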
Retrieve log context around a given log.

    clscli context <PkgId> <PkgLogId> --region <region> -t <TopicId>

Examples: `--output=json`, `--output=csv`, `-o context.json` (write to file)

| Option | Required | Type | Description | Example |
| --- | --- | --- | --- | --- |
| `--region` | yes | String | CLS region | ap-guangzhou |
| `-t`, `--topic` | yes | String | Log topic ID | - |
| `PkgId` | yes | String | Log package ID, i.e. SearchLog `Results[].PkgId` | 528C1318606EFEB8-1A7 |
| `PkgLogId` | yes | Integer | Index within the package, i.e. SearchLog `Results[].PkgLogId` | 65536 |
| `--output`, `-o` | no | - | Output: json, csv, or file path | - |
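Tying `query` and `context` together: a query result's `Results[].PkgId` and `Results[].PkgLogId` feed directly into `clscli context`. The sketch below uses the sample values shown above and prints the command rather than executing it:

```shell
# Dry-run sketch: compose a context lookup from a PkgId/PkgLogId
# pair as returned in a query's Results[] entries. The IDs are the
# documentation's sample values; <TopicId> stays a placeholder.
PKG_ID="528C1318606EFEB8-1A7"
PKG_LOG_ID=65536

CMD="clscli context $PKG_ID $PKG_LOG_ID --region ap-guangzhou -t <TopicId> -o context.json"
echo "$CMD"
```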
Code helpers, APIs, CLIs, browser automation, testing, and developer operations.
Largest current source with strong distribution and engagement signals.