What you’ll build
A repeatable workflow that:
- captures signals from SQL Server (DMVs) and SolarWinds DPA,
- drops those files into a VS Code workspace, and
- uses Amazon Q inside VS Code to analyze bottlenecks, explain plans, and propose fixes.
Optionally, you’ll publish those artifacts to an S3-backed knowledge base so Amazon Q can answer questions over time.
Prerequisites
- VS Code with the Amazon Q extension installed and authenticated (Builder ID for the free tier or IAM Identity Center for Pro). In VS Code, install “Amazon Q” from the Marketplace, click the Q icon in the Activity Bar, and sign in.
- Access to SolarWinds Database Performance Analyzer (DPA), or at least the DPA reports you need to export.
- SSMS or Azure Data Studio with permissions to run DMVs on the target SQL Server.
- Optional for long-term Q&A: an AWS account with Amazon Bedrock Knowledge Bases or Amazon Q Business to index S3 documents.
Folder layout in VS Code
Create a workspace folder, for example sql-perf-investigation/:
sql-perf-investigation/
dmv/
dpa/
screenshots/
prompts/
README.md
Amazon Q works best when all the context files sit in the workspace it can read. You’ll drop CSVs, JSON, and exported reports here, then chat with Q right in the editor.
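If you work from a terminal, the layout above can be created in one step (the folder names are just the conventions from this section, not anything Amazon Q requires):

```shell
# Create the suggested workspace skeleton
mkdir -p sql-perf-investigation/dmv \
         sql-perf-investigation/dpa \
         sql-perf-investigation/screenshots \
         sql-perf-investigation/prompts
touch sql-perf-investigation/README.md
```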
Step 1 — Install and sign in to Amazon Q in VS Code
- In VS Code, install Amazon Q from the Marketplace.
- Click the Q icon in the Activity Bar and sign in with Builder ID (free) or IAM Identity Center (Pro). If your org uses Pro, choose Company account and enter the Start URL provided by your admin.
That’s it. You’ll see a chat panel and can ask Q to help, summarize files, and reason over workspace content.
Step 2 — Collect SQL Server signals with DMVs
Open SSMS and run the following. Save each result as CSV into ./dmv/ (SSMS: Results to Grid → Save Results As…).
A) Top queries by CPU, reads, and duration
-- Top statements by average CPU from the current plan cache
-- (total_worker_time and total_elapsed_time are reported in microseconds)
SELECT TOP 20
    DB_NAME(qt.dbid) AS database_name,
    qs.total_worker_time / NULLIF(qs.execution_count, 0) AS avg_cpu_us,
    qs.total_elapsed_time / NULLIF(qs.execution_count, 0) AS avg_duration_us,
    qs.total_logical_reads / NULLIF(qs.execution_count, 0) AS avg_reads,
    qs.execution_count,
    qs.last_execution_time,
    SUBSTRING(qt.text, (qs.statement_start_offset / 2) + 1,
        (CASE WHEN qs.statement_end_offset = -1
              THEN DATALENGTH(qt.text)
              ELSE qs.statement_end_offset
         END - qs.statement_start_offset) / 2 + 1) AS statement_text,
    qs.plan_handle
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
ORDER BY avg_cpu_us DESC;
This relies on sys.dm_exec_query_stats, which aggregates performance statistics for cached plans.
B) Waits snapshot (what the server is waiting on)
-- High-impact waits since last restart (quick view)
SELECT TOP 20
    wait_type,
    SUM(wait_time_ms) AS wait_time_ms,
    SUM(signal_wait_time_ms) AS signal_wait_time_ms,
    SUM(waiting_tasks_count) AS waiting_tasks
FROM sys.dm_os_wait_stats
WHERE wait_type NOT LIKE 'SLEEP%'
  AND wait_type NOT LIKE 'BROKER_TASK_STOP%'
GROUP BY wait_type
ORDER BY SUM(wait_time_ms) DESC;
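One quick read on the waits CSV before you even involve Q: the ratio of signal wait to total wait. Signal wait is time spent queuing for a CPU after the awaited resource was already available, so a high ratio points at CPU pressure rather than I/O or locking. A minimal sketch in Python, assuming the column names produced by query B (the filename and the 25% rule of thumb are illustrative):

```python
import csv
from io import StringIO

def signal_wait_ratio(csv_text: str) -> float:
    """Fraction of total wait time that is signal wait. A high ratio
    (rule of thumb: above ~0.25) suggests CPU pressure."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    total = sum(int(r["wait_time_ms"]) for r in rows)
    signal = sum(int(r["signal_wait_time_ms"]) for r in rows)
    return signal / total if total else 0.0

# Tiny inline sample standing in for a saved ./dmv/waits.csv
sample = """wait_type,wait_time_ms,signal_wait_time_ms,waiting_tasks
CXPACKET,120000,15000,900
PAGEIOLATCH_SH,80000,2000,400
SOS_SCHEDULER_YIELD,40000,38000,5000
"""
print(round(signal_wait_ratio(sample), 3))  # → 0.229
```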
C) Currently running and blocking
-- Who's running now / potential blockers
SELECT
    r.session_id,
    r.status,
    r.blocking_session_id,
    r.cpu_time,
    r.total_elapsed_time,
    r.wait_type,
    r.wait_time,
    r.database_id,
    SUBSTRING(t.text, (r.statement_start_offset / 2) + 1,
        (CASE WHEN r.statement_end_offset = -1
              THEN DATALENGTH(t.text)
              ELSE r.statement_end_offset
         END - r.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID
ORDER BY r.total_elapsed_time DESC;
D) Get the plan text for a specific plan_handle
After you save A–C, pick a problem plan from the results and fetch its text plan:
-- Replace 0x... with a plan_handle from the A) result set
DECLARE @plan_handle VARBINARY(64) = 0x...;

SELECT
    DB_NAME(st.dbid) AS database_name,
    qp.query_plan,
    st.text
FROM sys.dm_exec_text_query_plan(@plan_handle, DEFAULT, DEFAULT) AS qp
CROSS APPLY sys.dm_exec_sql_text(@plan_handle) AS st;
sys.dm_exec_text_query_plan returns the showplan as text (XML in an nvarchar(max) column), handy for diffing in Git.
Tip: If you like, also save a .sqlplan from SSMS for visual inspection.
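Because the plan text is showplan XML, you can also mine it locally before handing it to Q, for example to list the physical operators and spot scans or key lookups at a glance. A sketch using Python's standard XML parser; the sample fragment below is hand-written and much smaller than a real plan, but the namespace and the RelOp/PhysicalOp shape match real showplan output:

```python
import xml.etree.ElementTree as ET

NS = "{http://schemas.microsoft.com/sqlserver/2004/07/showplan}"

def physical_ops(plan_xml: str) -> list[str]:
    """Return the PhysicalOp of every RelOp node, in document order."""
    root = ET.fromstring(plan_xml)
    return [el.attrib["PhysicalOp"] for el in root.iter(f"{NS}RelOp")]

# Minimal hand-written fragment standing in for a real query_plan value
sample = """<ShowPlanXML xmlns="http://schemas.microsoft.com/sqlserver/2004/07/showplan">
  <RelOp PhysicalOp="Nested Loops">
    <RelOp PhysicalOp="Clustered Index Scan"/>
    <RelOp PhysicalOp="Key Lookup"/>
  </RelOp>
</ShowPlanXML>"""
print(physical_ops(sample))
```

A Clustered Index Scan feeding a Key Lookup, as in the sample, is exactly the shape a covering index (Step 6) is meant to eliminate.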
Step 3 — Export SolarWinds DPA insights
Open DPA and export artifacts that correlate to the same time window:
- Top SQL Statements report (CSV or PDF).
- Wait Time trend for the instance (PNG or PDF).
- Instance or Query Detail for a hot statement (CSV, HTML, or screenshot).
Place exports into ./dpa/. DPA focuses on workload-level query performance and wait analysis; it includes exports and report sharing options. If you have the Orion Platform for reporting, you can also export reports as Excel, HTML, or images.
If your DPA version supports custom entity exports, you can back up customizations too, though for this workflow CSV/PDF is enough.
Step 4 — Bring everything into VS Code
Put your CSVs, HTML/PDF exports, and any .sqlplan files under:
sql-perf-investigation/
dmv/*.csv
dpa/*.csv|*.pdf|*.png
screenshots/*.png
Open the folder in VS Code. Commit it to a lightweight Git repo if you want history.
Step 5 — Ask Amazon Q to analyze your evidence
Open the Amazon Q panel in VS Code and reference the files directly in your prompts. Q can read open files and the workspace, so keep the tab active or mention filenames.
Example prompts
- “Read ./dmv/top_cpu.csv and ./dpa/top_sql.csv. Group queries by normalized text, call out outliers by avg CPU and reads, and cross-check with DPA’s wait profile. What three queries should I tune first and why?”
- “Here’s ./dmv/blocking_now.csv. Identify the blocking chain, the lead blocker, and likely causes. Recommend the smallest safe fixes.”
- “Given this plan text in ./dmv/plan_text.sql, explain the key operators that cause high reads. Suggest index or rewrite options, and warn about parameter sniffing if you see it.”
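The “group queries by normalized text” step in the first prompt is also easy to do locally, which is handy for sanity-checking Q’s grouping. A minimal sketch: collapse literals and whitespace so parameter-only variants of the same statement fall into one bucket. The regex rules here are illustrative, not a full T-SQL normalizer (they would mangle, e.g., quoted identifiers):

```python
import re
from collections import Counter

def normalize_sql(stmt: str) -> str:
    """Collapse literals and whitespace so parameter-only variants
    of a statement group together (illustrative rules only)."""
    s = stmt.strip().upper()
    s = re.sub(r"'[^']*'", "'?'", s)   # string literals -> '?'
    s = re.sub(r"\b\d+\b", "?", s)     # numeric literals -> ?
    s = re.sub(r"\s+", " ", s)         # collapse whitespace runs
    return s

statements = [
    "SELECT * FROM Orders WHERE Id = 42",
    "select *  from orders where id = 7",
    "SELECT * FROM Orders WHERE Region = 'EU'",
]
groups = Counter(normalize_sql(s) for s in statements)
print(groups.most_common(1)[0])  # → ('SELECT * FROM ORDERS WHERE ID = ?', 2)
```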
Amazon Q’s chat in the IDE is designed to reason over code and files you select or open, summarize content, and generate or refactor code and scripts.
If you want Q to produce runnable remediation scripts, ask for them explicitly:
- “Generate T-SQL to create a narrow covering index that reduces key lookups for the OrderLines hot path you identified. Keep index keys small and include columns used only in SELECT. Provide a rollback script as well.”
Step 6 — Turn findings into actionable scripts
Here are useful templates to ask Q to fill in based on the evidence.
Index recommendation scaffold
-- Example: targeted covering index (adjust names/columns)
CREATE NONCLUSTERED INDEX IX_OrderLines_OrderId_Cover
ON dbo.OrderLines (OrderId, LineStatus)
INCLUDE (Sku, Qty, UnitPrice);
GO
-- Rollback, matching the example index above:
-- DROP INDEX IX_OrderLines_OrderId_Cover ON dbo.OrderLines;

-- Validate improvement: re-run DMV snapshot and compare avg reads/CPU
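The “validate improvement” step in that last comment is a mechanical diff you can script: compare two saved snapshots of the query-A CSV per statement. A sketch assuming the statement_text and avg_reads columns from query A (the inline CSVs are toy stand-ins for real snapshot files):

```python
import csv
from io import StringIO

def avg_reads_delta(before_csv: str, after_csv: str) -> dict[str, float]:
    """Per-statement change in avg logical reads between two snapshots
    of the query-A CSV. Negative values mean the statement improved."""
    def load(text: str) -> dict[str, float]:
        return {r["statement_text"]: float(r["avg_reads"])
                for r in csv.DictReader(StringIO(text))}
    before, after = load(before_csv), load(after_csv)
    # Only statements present in both snapshots can be compared
    return {stmt: after[stmt] - before[stmt]
            for stmt in before.keys() & after.keys()}

before = "statement_text,avg_reads\nSELECT A,1200\nSELECT B,300\n"
after  = "statement_text,avg_reads\nSELECT A,150\nSELECT B,310\n"
print(sorted(avg_reads_delta(before, after).items()))
```

In practice you would normalize the statement text first (as in Step 5) so recompiled variants of the same query still match up.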
Parameter sniffing guardrail
-- Force recompile or set OPTIMIZE FOR to stabilize bad plans
-- Use carefully and test first
SELECT ...
FROM dbo.Orders o
JOIN dbo.OrderLines l ON l.OrderId = o.Id
WHERE o.OrderDate >= @FromDate
OPTION (RECOMPILE); -- or OPTIMIZE FOR (@FromDate = '2025-01-01');
Blocker diagnosis helper
-- Who's blocking whom, with input buffer for the blocker
SELECT r.session_id, r.blocking_session_id, t.text, r.wait_type, r.wait_time
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.blocking_session_id <> 0;
-- Inspect what the blocker is running
DBCC INPUTBUFFER(<blocking_session_id>);
-- On SQL Server 2016 SP1+, the DMV form avoids DBCC permissions quirks:
-- SELECT * FROM sys.dm_exec_input_buffer(<blocking_session_id>, NULL);
(You can have Q write these with your table names once it has the CSVs and plan text.)
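You can also walk the blocking chain yourself from the saved CSV rather than eyeballing it: follow blocking_session_id links until you reach sessions that block others but are not blocked themselves. A sketch assuming the session_id and blocking_session_id columns from query C:

```python
import csv
from io import StringIO

def lead_blockers(csv_text: str) -> list[int]:
    """Return root blockers: sessions that appear as someone's
    blocking_session_id but are not blocked themselves. A blocker
    absent from the CSV (e.g. an idle session holding locks) is
    treated as a root too."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    blocked_by = {int(r["session_id"]): int(r["blocking_session_id"])
                  for r in rows}
    blockers = {b for b in blocked_by.values() if b != 0}
    return sorted(b for b in blockers if blocked_by.get(b, 0) == 0)

# Toy stand-in for ./dmv/blocking_now.csv: 51 blocks 52, 52 blocks 53
sample = """session_id,blocking_session_id
51,0
52,51
53,52
60,0
"""
print(lead_blockers(sample))  # → [51]
```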
Optional — Give Amazon Q a memory with a Knowledge Base
If you want Q to answer questions over weeks and months, publish your exports to S3 and index them with Amazon Bedrock Knowledge Bases or Amazon Q Business. High-level flow:
- Create a Knowledge Base in Bedrock and connect it to an S3 bucket that receives your CSVs, HTML/PDFs, and plan text. The service will ingest and index the documents for retrieval-augmented answers.
- Or stand up an Amazon Q Business app and connect your S3 (and other sources) as data sources, then sync.
- Point your team to Q, and ask questions like: “Show me last month’s top three queries by wait time across all instances and link the DPA reports.”
This keeps your performance history queryable as your environment changes.
Step 7 — Automate the capture
You can schedule both DMV and DPA exports and drop them into the workspace or S3.
- DMVs: a SQL Agent job or PowerShell script that runs your DMV queries and writes CSVs with timestamps.
- DPA: schedule reports and export, or use the platform’s report exports to CSV/HTML and save them.
Example PowerShell stub to call sqlcmd and write CSV:
$ts = Get-Date -Format "yyyyMMdd-HHmm"
# -i runs the script file directly (avoids quoting issues with -Q);
# -s "," -W produce comma-separated, trimmed output
sqlcmd -S myserver -d master -E -i ".\dmv\top_cpu.sql" -s "," -W |
    Where-Object { $_ -notmatch '^-+,' } |   # drop the dashed header-underline row
    Out-File ".\dmv\top_cpu-$ts.csv" -Encoding UTF8
If you’re using the knowledge base route, push to S3 and trigger a sync.
Example session in VS Code
- You open top_cpu-20250812.csv and dpa/top_sql-20250812.csv.
- In the Q chat: “Compare these two files. Normalize query texts, identify the worst offenders by avg CPU and reads, and match them to the highest waits from waits-20250812.csv. Suggest fixes and produce T-SQL scripts.”
- Q summarizes outliers, explains why a plan is scanning, and generates a covering index plus a test script.
- You run the scripts in a non-prod DB, re-capture DMVs, and ask Q to diff before vs after.
What Amazon Q is good at in this workflow
- Comparing multiple CSVs and calling out outliers.
- Explaining query plans in plain English and proposing tuning paths.
- Drafting T-SQL fix scripts and rollback steps.
- Turning your notes into a clean change record.
For coding and analysis inside your editor, this is exactly what Q’s IDE chat is for.
Troubleshooting and tips
- No files detected by Q? Open the files in tabs or mention exact paths in your prompt so Q pulls them into context.
- DPA export formats: If CSV isn’t available for a specific view in your version, export HTML or take a screenshot and ask Q to work from the data you provide. The SolarWinds platform also supports exporting many reports to Excel/HTML/image.
- Plan cache is volatile: DMV stats reset on restart or plan eviction, and sys.dm_exec_query_stats only shows queries whose plans are still cached. Capture during the problem window and save snapshots.
- Keep Q updated: Install extension updates promptly so you have the latest capabilities and security fixes.
Wrap-up
You now have a simple loop: capture DMVs, export a couple of DPA views, drop them into a VS Code workspace, and let Amazon Q help you triage and tune. If you add a knowledge base on S3, you also get a growing, queryable memory of your environment that Q can reference during future incidents.