
Commit fb180b0

feat: add audit-comet-expression Claude Code skill (#3793)
1 parent 6855f2b commit fb180b0

1 file changed

Lines changed: 331 additions & 0 deletions

File tree

  • .claude/skills/audit-comet-expression
---
name: audit-comet-expression
description: Audit an existing Comet expression for correctness and test coverage. Studies the Spark implementation across versions 3.4.3, 3.5.8, and 4.0.1, reviews the Comet and DataFusion implementations, identifies missing test coverage, and offers to implement additional tests.
argument-hint: <expression-name>
---

Audit the Comet implementation of the `$ARGUMENTS` expression for correctness and test coverage.
## Overview

This audit covers:

1. Spark implementation across versions 3.4.3, 3.5.8, and 4.0.1
2. Comet Scala serde implementation
3. Comet Rust / DataFusion implementation
4. Existing test coverage (SQL file tests and Scala tests)
5. Gap analysis and test recommendations

---
## Step 1: Locate the Spark Implementations

Clone specific Spark version tags (use shallow clones to avoid polluting the workspace). Only clone a version if it is not already present.

```bash
set -eu -o pipefail
for tag in v3.4.3 v3.5.8 v4.0.1; do
  dir="/tmp/spark-${tag}"
  if [ ! -d "$dir" ]; then
    git clone --depth 1 --branch "$tag" https://github.com/apache/spark.git "$dir"
  fi
done
```

### Find the expression class in each Spark version

Search the Catalyst SQL expressions source:

```bash
for tag in v3.4.3 v3.5.8 v4.0.1; do
  dir="/tmp/spark-${tag}"
  echo "=== $tag ==="
  find "$dir/sql/catalyst/src/main/scala" -name "*.scala" | \
    xargs grep -l "case class $ARGUMENTS\b\|object $ARGUMENTS\b" 2>/dev/null
done
```

If the expression is not found in catalyst, also check core:

```bash
for tag in v3.4.3 v3.5.8 v4.0.1; do
  dir="/tmp/spark-${tag}"
  echo "=== $tag ==="
  find "$dir/sql" -name "*.scala" | \
    xargs grep -l "case class $ARGUMENTS\b\|object $ARGUMENTS\b" 2>/dev/null
done
```

### Read the Spark source for each version

For each Spark version, read the expression file and note the following (a hypothetical sketch of where these pieces usually live follows the list):

- The `eval`, `nullSafeEval`, and `doGenCode` / `nullSafeCodeGen` methods
- The `inputTypes` and `dataType` fields (accepted input types, return type)
- Null handling strategy (`nullable`, `nullSafeEval`)
- ANSI mode behavior (`ansiEnabled`, `failOnError`)
- Special cases, guards, `require` assertions, and runtime exceptions
- Any constants or configuration the expression reads
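For orientation, the sketch below shows roughly where these pieces live in a Catalyst expression. `MyExpr` is hypothetical and invented for this illustration, not taken from the Spark source, and real expressions vary considerably.

```scala
// Hypothetical illustration only. The expression, its types, and its behavior
// are invented for this sketch and do not correspond to a real Spark expression.
import org.apache.spark.sql.catalyst.expressions.{Expression, ImplicitCastInputTypes, UnaryExpression}
import org.apache.spark.sql.catalyst.expressions.codegen.{CodegenContext, ExprCode}
import org.apache.spark.sql.types.{AbstractDataType, DataType, IntegerType}

case class MyExpr(child: Expression, failOnError: Boolean)
    extends UnaryExpression with ImplicitCastInputTypes {

  // Accepted input types and the result type: record these for each Spark version.
  override def inputTypes: Seq[AbstractDataType] = Seq(IntegerType)
  override def dataType: DataType = IntegerType

  // Null handling: UnaryExpression.eval returns null for null input and only
  // calls nullSafeEval for non-null values.
  override protected def nullSafeEval(input: Any): Any = {
    val v = input.asInstanceOf[Int]
    if (failOnError && v == Int.MinValue) {
      // ANSI-mode branch: throw instead of silently overflowing.
      throw new ArithmeticException("integer overflow")
    }
    math.abs(v)
  }

  // Codegen path: should mirror the interpreted path above.
  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode =
    nullSafeCodeGen(ctx, ev, c => s"""
      |if ($failOnError && $c == Integer.MIN_VALUE) {
      |  throw new ArithmeticException("integer overflow");
      |}
      |${ev.value} = java.lang.Math.abs($c);
      |""".stripMargin)

  override protected def withNewChildInternal(newChild: Expression): MyExpr =
    copy(child = newChild)
}
```

Note which of the interpreted path and the codegen path carries each special case. Version-to-version changes can land in either one.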
### Compare across Spark versions

Produce a concise diff summary of what changed between:

- 3.4.3 → 3.5.8
- 3.5.8 → 4.0.1

Pay attention to:

- New input types added or removed
- Behavior changes for edge cases (null, overflow, empty, boundary)
- New ANSI mode branches
- New parameters or configuration
- Breaking API changes that Comet must shim

---
## Step 2: Locate the Spark Tests

```bash
for tag in v3.4.3 v3.5.8 v4.0.1; do
  dir="/tmp/spark-${tag}"
  echo "=== $tag ==="
  find "$dir/sql" -name "*.scala" -path "*/test/*" | \
    xargs grep -l "$ARGUMENTS" 2>/dev/null
done
```

Read the relevant Spark test files and produce a list of:

- Input types covered
- Edge cases exercised (null, empty, overflow, negative, boundary values, special characters, etc.)
- ANSI mode tests
- Error cases

This list will be the reference for the coverage gap analysis in Step 5.

---
## Step 3: Locate the Comet Implementation

### Scala serde

```bash
# Find the serde object
grep -r "$ARGUMENTS" spark/src/main/scala/org/apache/comet/serde/ --include="*.scala" -l
grep -r "$ARGUMENTS" spark/src/main/scala/org/apache/comet/ --include="*.scala" -l
```

Read the serde implementation and check the following (a rough sketch of the serde shape appears after the list):

- Which Spark versions the serde handles
- Whether `getSupportLevel` is implemented and accurate
- Whether all input types are handled
- Whether any types are explicitly marked `Unsupported`
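For orientation, a serde object usually pairs a support-level check with the protobuf conversion. The sketch below continues the hypothetical `MyExpr` example. The trait name, the `convert` signature, and the `SupportLevel` constructors are assumptions about Comet's serde API, so verify them against the actual serde code.

```scala
// Rough sketch only. The trait, the method signatures, and the Compatible /
// Incompatible / Unsupported constructors are assumptions, not the real API.
import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.types.{IntegerType, LongType}

object CometMyExpr extends CometExpressionSerde[MyExpr] {

  // The audit checks that this matches what the native side actually supports.
  override def getSupportLevel(expr: MyExpr): SupportLevel = expr.child.dataType match {
    case IntegerType | LongType => Compatible()
    case dt => Unsupported(Some(s"$dt inputs are not supported natively"))
  }

  override def convert(
      expr: MyExpr,
      inputs: Seq[Attribute],
      binding: Boolean): Option[ExprOuterClass.Expr] = {
    // Serialize the child and wrap it in the protobuf node for this expression.
    // Omitted here: the audit mainly checks that every supported type is handled.
    ???
  }
}
```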
### Shims

```bash
find spark/src/main -name "CometExprShim.scala" | xargs grep -l "$ARGUMENTS" 2>/dev/null
```

If shims exist, read them and note any version-specific handling.

### Rust / DataFusion implementation

```bash
# Search for the function in native/spark-expr
grep -r "$ARGUMENTS" native/spark-expr/src/ --include="*.rs" -l
grep -r "$ARGUMENTS" native/core/src/ --include="*.rs" -l
```

If the expression delegates to DataFusion, find it there too. Set `$DATAFUSION_SRC` to a local DataFusion checkout, or fall back to searching the cargo registry:

```bash
if [ -n "${DATAFUSION_SRC:-}" ]; then
  grep -r "$ARGUMENTS" "$DATAFUSION_SRC" --include="*.rs" -l 2>/dev/null | head -10
else
  # Fall back to cargo registry (may include unrelated crates)
  grep -r "$ARGUMENTS" ~/.cargo/registry/src/*/datafusion* --include="*.rs" -l 2>/dev/null | head -10
fi
```

Read the Rust implementation and check:

- Null handling (does it propagate nulls correctly?)
- Overflow and underflow handling (returns `Err` vs panics)
- Type dispatch (does it handle all types that Spark supports?)
- ANSI / fail-on-error mode

---
## Step 4: Locate Existing Comet Tests

### SQL file tests

```bash
# Find SQL test files for this expression
find spark/src/test/resources/sql-tests/expressions/ -name "*.sql" | \
  xargs grep -l "$ARGUMENTS" 2>/dev/null

# Also check if there's a dedicated file
find spark/src/test/resources/sql-tests/expressions/ -name "*$(echo $ARGUMENTS | tr '[:upper:]' '[:lower:]')*"
```

Read every SQL test file found and list:

- Table schemas and data values used
- Queries exercised
- Query modes used (`query`, `spark_answer_only`, `tolerance`, `ignore`, `expect_error`)
- Any ConfigMatrix directives

### Scala tests

```bash
grep -r "$ARGUMENTS" spark/src/test/scala/ --include="*.scala" -l
```

Read the relevant Scala test files and list:

- Input types covered
- Edge cases exercised
- Whether constant folding is disabled for literal tests (a sketch of such a test follows this list)
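A typical literal test disables constant folding so the expression actually reaches Comet instead of being folded away by the optimizer. The sketch below is illustrative: the suite name, the `myexpr` function, and the `checkSparkAnswerAndOperator` helper are assumptions about the existing test base, so adapt it to the real suites.

```scala
// Illustrative sketch. Assumes a CometTestBase-style suite providing
// withSQLConf and checkSparkAnswerAndOperator; names here are not authoritative.
import org.apache.spark.sql.internal.SQLConf

class CometMyExprSuite extends CometTestBase {

  test("myexpr with literal arguments") {
    // Disable constant folding so literal-only expressions are not evaluated
    // by Catalyst before Comet ever sees them.
    withSQLConf(
      SQLConf.OPTIMIZER_EXCLUDED_RULES.key ->
        "org.apache.spark.sql.catalyst.optimizer.ConstantFolding") {
      checkSparkAnswerAndOperator("SELECT myexpr(1), myexpr(CAST(NULL AS INT))")
    }
  }
}
```

Per Step 7, SQL file tests remain the preferred home for new coverage, so reserve Scala tests like this for cases that need programmatic setup.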
---

## Step 5: Gap Analysis

Compare the Spark test coverage (Step 2) against the Comet test coverage (Step 4). Produce a structured gap report:

### Coverage matrix

For each of the following dimensions, note whether it is covered in Comet tests or missing:
| Dimension | Spark tests it | Comet SQL test | Comet Scala test | Gap? |
| --------- | -------------- | -------------- | ---------------- | ---- |
| Column reference argument(s) | | | | |
| Literal argument(s) | | | | |
| NULL input | | | | |
| Empty string / empty array / empty map | | | | |
| Array/map with NULL elements | | | | |
| Zero, negative zero, negative values (numeric) | | | | |
| Underflow, overflow | | | | |
| Boundary values (INT_MIN, INT_MAX, Long.MinValue, minimum positive, etc.) | | | | |
| NaN, Infinity, -Infinity, subnormal (float/double) | | | | |
| Multibyte / special UTF-8 (composed vs decomposed, e.g. `é` U+00E9 vs `e` + U+0301, non-Latin scripts) | | | | |
| ANSI mode (failOnError=true) | | | | |
| Non-ANSI mode (failOnError=false) | | | | |
| All supported input types | | | | |
| Parquet dictionary encoding (ConfigMatrix) | | | | |
| Cross-version behavior differences | | | | |

### Implementation gaps

Also review the Comet implementation (Step 3) against the Spark behavior (Step 1):

- Are there input types that Spark supports but `getSupportLevel` returns `Unsupported` without comment?
- Are there behavioral differences that are NOT marked `Incompatible` but should be?
- Are there behavioral differences between Spark versions that the Comet implementation does not account for (missing shim)?
- Does the Rust implementation match the Spark behavior for all edge cases?

---
## Step 6: Recommendations

Summarize findings as a prioritized list.

### High priority

Issues where Comet may silently produce wrong results compared to Spark.

### Medium priority

Missing test coverage for edge cases that could expose bugs.

### Low priority

Minor gaps, cosmetic improvements, or nice-to-have tests.

---
## Step 7: Offer to Implement Missing Tests

After presenting the gap analysis, ask the user:

> I found the following missing test cases. Would you like me to implement them?
>
> - [list each missing test case]
>
> I can add them as SQL file tests in `spark/src/test/resources/sql-tests/expressions/<category>/$ARGUMENTS.sql`
> (or as Scala tests in `CometExpressionSuite` for cases that require programmatic setup).

If the user says yes, implement the missing tests following the SQL file test format described in
`docs/source/contributor-guide/sql-file-tests.md`. Prefer SQL file tests over Scala tests.

### SQL file test template

```sql
-- Licensed to the Apache Software Foundation (ASF) under one
-- or more contributor license agreements. See the NOTICE file
-- distributed with this work for additional information
-- regarding copyright ownership. The ASF licenses this file
-- to you under the Apache License, Version 2.0 (the
-- "License"); you may not use this file except in compliance
-- with the License. You may obtain a copy of the License at
--
-- http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing,
-- software distributed under the License is distributed on an
-- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-- KIND, either express or implied. See the License for the
-- specific language governing permissions and limitations
-- under the License.

-- ConfigMatrix: parquet.enable.dictionary=false,true

statement
CREATE TABLE test_$ARGUMENTS(...) USING parquet

statement
INSERT INTO test_$ARGUMENTS VALUES
  (...),
  (NULL)

-- column argument
query
SELECT $ARGUMENTS(col) FROM test_$ARGUMENTS

-- literal arguments
query
SELECT $ARGUMENTS('value'), $ARGUMENTS(''), $ARGUMENTS(NULL)
```

### Verify the tests pass

After implementing tests, tell the user how to run them:

```bash
./mvnw test -DwildcardSuites="CometSqlFileTestSuite" -Dsuites="org.apache.comet.CometSqlFileTestSuite $ARGUMENTS" -Dtest=none
```

---
## Output Format

Present the audit as:

1. **Expression Summary** - Brief description of what `$ARGUMENTS` does, its input/output types, and null behavior
2. **Spark Version Differences** - Summary of any behavioral or API differences across Spark 3.4.3, 3.5.8, and 4.0.1
3. **Comet Implementation Notes** - Summary of how Comet implements this expression and any concerns
4. **Coverage Gap Analysis** - The gap table from Step 5, plus implementation gaps
5. **Recommendations** - Prioritized list from Step 6
6. **Offer to add tests** - The prompt from Step 7

## Tone and Style

- Write in clear, concise prose
- Use backticks around code references (function names, file paths, class names, types, config keys)
- Avoid robotic or formulaic language
- Be constructive and acknowledge what is already well-covered before raising gaps
- Avoid em dashes and semicolons; use separate sentences instead
