Hi,
This article discusses how to effectively manage the archiving of the DCS database based on your retention needs and/or performance requirements. This process allows you to minimise the number of events stored in your active SCSPDB_[Name] database whilst still meeting your audit requirements.
This is likely necessary if your DCS environment generates a lot of noise, most commonly detection events, particularly if you are centrally logging events in DCS.
In this example, the customer has a retention requirement of roughly six months of events, with some prevention policies but mainly detection.
- Connect to your Database instance that hosts your DCS database
- Navigate to your database, typically named SCSPDB
- Navigate to Tables and run the query below. Change the date to something appropriate for your environment. Choose a cut-off that is roughly one week in the past, to ensure your agents have checked in and that your systems have some overlap. One week should be enough unless you are experiencing a very serious event-posting delay.
SELECT event_type, COUNT(1)
FROM cspevent cs
WHERE cs.event_dt <= '2017-03-18 00:00:00.000'
GROUP BY event_type
- You will use these counts to verify that your database has been migrated with integrity. There will be some disparity between the total events in the live and archived databases, hence the cut-off date above, which acts as a fixed timestamp to make the comparison reliable.
- Record the results for reference later.
- Create a backup of the database (manually through SSMS or via your automated backup solution)
- Restore your new backup under a different name, e.g. SCSPDB_Review_JanToJun2016
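The backup and restore in the two steps above can be sketched in T-SQL. The file paths and the logical file names (SCSPDB / SCSPDB_log) below are placeholders I've assumed for illustration; check the output of RESTORE FILELISTONLY and substitute your own:

```sql
-- Back up the live database (path is an example; use your backup location)
BACKUP DATABASE [SCSPDB]
TO DISK = N'D:\Backups\SCSPDB_Archive.bak'
WITH COPY_ONLY, INIT, STATS = 10;

-- Inspect the logical file names inside the backup first
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\SCSPDB_Archive.bak';

-- Restore under a new name, moving the files so they do not clash with the live DB
-- (logical names 'SCSPDB' / 'SCSPDB_log' are assumptions; use FILELISTONLY's output)
RESTORE DATABASE [SCSPDB_Review_JanToJun2016]
FROM DISK = N'D:\Backups\SCSPDB_Archive.bak'
WITH MOVE N'SCSPDB'     TO N'D:\Data\SCSPDB_Review.mdf',
     MOVE N'SCSPDB_log' TO N'D:\Data\SCSPDB_Review.ldf',
     STATS = 10;
```

COPY_ONLY keeps this backup out of your regular backup chain, which is handy if your automated solution relies on differential backups.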
- Run the same query as step 3 against the restored database and ensure the record counts match. If they do not, start again from step 3 and double-check the figures and that the backup processed properly.
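The verification here is just the step 3 query re-pointed at the restored copy; the review database name below is the example from the previous step, so swap in your own:

```sql
USE [SCSPDB_Review_JanToJun2016];

SELECT event_type, COUNT(1) AS event_count
FROM cspevent cs
WHERE cs.event_dt <= '2017-03-18 00:00:00.000'  -- same cut-off date as step 3
GROUP BY event_type;
```

Comparing per-event-type counts rather than a single grand total makes it easier to spot a partially failed restore.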
- If they do match, then on the original DB run the purge script found under Programmability: SCSP_PurgeEvents (6.5) or PurgeEventsByDate (6.6 onwards). An example for the 6.5 purge script is shown below. It deletes all Realtime events older than 7 days, and it will delete as many as it can as fast as it can. Change the purge limit to, say, 100000 if you want to control the load on the database and minimise table locking.
DECLARE @RC int
DECLARE @EventCLASS nvarchar(100) = 'Realtime'  -- One of "REALTIME", "PROFILE", "ANALYSIS"
DECLARE @PurgeMode nvarchar(100) = 'Purge'      -- One of "TESTMODE", "PURGE" (TESTMODE shows what would happen but does not actually delete anything; PURGE does!)
DECLARE @FilterMode nvarchar(100) = 'Days'
DECLARE @FilterValue nvarchar(4000) = '7'       -- Number of days to keep (anything older will be deleted)
DECLARE @PurgeLimit int = 0                     -- Or e.g. 100000: a "governor" to limit how many records are deleted at once
DECLARE @Process_Rules varchar(8) = 'P'         -- Processing mode flag: P = print, Q = quiet

EXECUTE @RC = [DCSSA_Review].[dbo].[SCSP_PurgeEvents]
   @EventCLASS
  ,@PurgeMode
  ,@FilterMode
  ,@FilterValue
  ,@PurgeLimit
  ,@Process_Rules
GO
- NOTE: In 6.6 onwards the script has changed and TESTMODE actually purges the events, so be careful.
Now you'll have a slimmed-down CSP database that is quicker to query and less cluttered, and a review database that you can query directly via SQL and/or SSRS to create KPIs and/or analysis on the data. Tip: you can extract the SQL from the CSP reports in the Java Console and use that to query directly or via SSRS.
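As a starting point for KPIs against the review database, a simple aggregation like this (a sketch; the table and column names come from the count query earlier in the article, everything else is illustrative) gives daily event volumes per type, which drops straight into an SSRS trend chart:

```sql
-- Daily event volumes by type, e.g. for an SSRS trend chart
SELECT CAST(cs.event_dt AS date) AS event_day,
       cs.event_type,
       COUNT(1) AS event_count
FROM cspevent cs
GROUP BY CAST(cs.event_dt AS date), cs.event_type
ORDER BY event_day, cs.event_type;
```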
Any questions, let me know.
Thanks,
Kevin