Commit a3e2b99

AD model performance benchmark (opensearch-project#728)
AD model performance benchmark

This PR adds an AD model performance benchmark so that we can compare model performance across versions. For the single-stream detector, we refactored the tests in DetectionResultEvalutationIT and moved them to SingleStreamModelPerfIT. For the HCAD detector, we randomly generated synthetic data with known anomalies inserted throughout the signal. In particular, these are one-, two-, and four-dimensional data sets where each dimension is a noisy cosine wave. Anomalies are inserted into one dimension with probability 0.003, and anomalies across dimensions can be independent or dependent. Each data set has approximately 5,000 observations and is generated using the same random seed, so the results are comparable across versions.

We also backported opensearch-project#600 so that we can capture the performance data in CI output, and fixed opensearch-project#712 by revising the client setup code.

Testing done:
* added unit tests to run the benchmark.

Signed-off-by: Kaituo Li <kaituo@amazon.com>
1 parent d8f0c35 · commit a3e2b99

13 files changed: +1179 -624 lines
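The data generator itself is not part of this excerpt, so the sketch below only illustrates the scheme the commit message describes: each dimension is a noisy cosine wave, anomalies are injected with probability 0.003, and a fixed seed keeps runs comparable. The class name, wave period, noise level, and spike size are made up for illustration and are not the values used by the benchmark.

import java.util.Random;

/** Illustrative sketch only; not the generator shipped with the benchmark. */
public class SyntheticCosineSketch {

    /** Generates numPoints x dimensions values: cosine plus Gaussian noise, with rare spikes. */
    public static double[][] generate(int numPoints, int dimensions, long seed) {
        Random random = new Random(seed);      // fixed seed => results comparable across versions
        double anomalyProbability = 0.003;     // matches the probability in the commit message
        double[][] data = new double[numPoints][dimensions];
        for (int t = 0; t < numPoints; t++) {
            for (int d = 0; d < dimensions; d++) {
                double base = Math.cos(2 * Math.PI * t / 50.0);  // wave period of 50 points (assumed)
                double noise = 0.1 * random.nextGaussian();      // noise level (assumed)
                data[t][d] = base + noise;
            }
            if (random.nextDouble() < anomalyProbability) {
                data[t][random.nextInt(dimensions)] += 5.0;      // inject a spike into one dimension
            }
        }
        return data;
    }

    public static void main(String[] args) {
        double[][] series = generate(5000, 2, 42L);              // ~5000 observations, two dimensions
        System.out.println("generated " + series.length + " points");
    }
}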

.github/workflows/benchmark.yml

+33

@@ -0,0 +1,33 @@
+name: Run AD benchmark
+on:
+  push:
+    branches:
+      - "*"
+  pull_request:
+    branches:
+      - "*"
+
+jobs:
+  Build-ad:
+    strategy:
+      matrix:
+        java: [17]
+      fail-fast: false
+
+    name: Run Anomaly detection model performance benchmark
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Setup Java ${{ matrix.java }}
+        uses: actions/setup-java@v1
+        with:
+          java-version: ${{ matrix.java }}
+
+      # anomaly-detection
+      - name: Checkout AD
+        uses: actions/checkout@v2
+
+      - name: Build and Run Tests
+        run: |
+          ./gradlew ':test' --tests "org.opensearch.ad.ml.HCADModelPerfTests" -Dtests.seed=2AEBDBBAE75AC5E0 -Dtests.security.manager=false -Dtests.locale=es-CU -Dtests.timezone=Chile/EasterIsland -Dtest.logs=true -Dmodel-benchmark=true
+          ./gradlew integTest --tests "org.opensearch.ad.e2e.SingleStreamModelPerfIT" -Dtests.seed=60CDDB34427ACD0C -Dtests.security.manager=false -Dtests.locale=kab-DZ -Dtests.timezone=Asia/Hebron -Dtest.logs=true -Dmodel-benchmark=true

.github/workflows/test_build_multi_platform.yml

+1 -1

@@ -60,7 +60,7 @@ jobs:
          ./gradlew assemble
      - name: Build and Run Tests
        run: |
-          ./gradlew build -Dtest.logs=true
+          ./gradlew build
      - name: Publish to Maven Local
        run: |
          ./gradlew publishToMavenLocal

DEVELOPER_GUIDE.md

+2

@@ -48,6 +48,8 @@ Currently we just put RCF jar in lib as dependency. Plan to publish to Maven and
 8. `./gradlew adBwcCluster#rollingUpgradeClusterTask -Dtests.security.manager=false` launches a cluster with three nodes of bwc version of OpenSearch with anomaly-detection and job-scheduler and tests backwards compatibility by performing rolling upgrade of each node with the current version of OpenSearch with anomaly-detection and job-scheduler.
 9. `./gradlew adBwcCluster#fullRestartClusterTask -Dtests.security.manager=false` launches a cluster with three nodes of bwc version of OpenSearch with anomaly-detection and job-scheduler and tests backwards compatibility by performing a full restart on the cluster upgrading all the nodes with the current version of OpenSearch with anomaly-detection and job-scheduler.
 10. `./gradlew bwcTestSuite -Dtests.security.manager=false` runs all the above bwc tests combined.
+11. `./gradlew ':test' --tests "org.opensearch.ad.ml.HCADModelPerfTests" -Dtests.seed=2AEBDBBAE75AC5E0 -Dtests.security.manager=false -Dtests.locale=es-CU -Dtests.timezone=Chile/EasterIsland -Dtest.logs=true -Dmodel-benchmark=true` launches HCAD model performance tests and logs the result in the standard output.
+12. `./gradlew integTest --tests "org.opensearch.ad.e2e.SingleStreamModelPerfIT" -Dtests.seed=60CDDB34427ACD0C -Dtests.security.manager=false -Dtests.locale=kab-DZ -Dtests.timezone=Asia/Hebron -Dtest.logs=true -Dmodel-benchmark=true` launches single stream AD model performance tests and logs the result in the standard output.

 When launching a cluster using one of the above commands, logs are placed in `/build/cluster/run node0/opensearch-<version>/logs`. Though the logs are teed to the console, in practice it's best to check the actual log file.

build.gradle

+15 -4

@@ -139,7 +139,6 @@ configurations.all {
    if (it.state != Configuration.State.UNRESOLVED) return
    resolutionStrategy {
        force "joda-time:joda-time:${versions.joda}"
-        force "com.fasterxml.jackson.core:jackson-core:2.13.4"
        force "commons-logging:commons-logging:${versions.commonslogging}"
        force "org.apache.httpcomponents:httpcore5:${versions.httpcore5}"
        force "commons-codec:commons-codec:${versions.commonscodec}"

@@ -219,6 +218,12 @@ test {
    }
    include '**/*Tests.class'
    systemProperty 'tests.security.manager', 'false'
+
+    if (System.getProperty("model-benchmark") == null || System.getProperty("model-benchmark") == "false") {
+        filter {
+            excludeTestsMatching "org.opensearch.ad.ml.HCADModelPerfTests"
+        }
+    }
}

task integTest(type: RestIntegTestTask) {

@@ -264,6 +269,12 @@ integTest {
        }
    }

+    if (System.getProperty("model-benchmark") == null || System.getProperty("model-benchmark") == "false") {
+        filter {
+            excludeTestsMatching "org.opensearch.ad.e2e.SingleStreamModelPerfIT"
+        }
+    }
+
    // The 'doFirst' delays till execution time.
    doFirst {
        // Tell the test JVM if the cluster JVM is running under a debugger so that tests can

@@ -664,9 +675,9 @@ dependencies {
    implementation 'software.amazon.randomcutforest:randomcutforest-core:3.0-rc3'

    // force Jackson version to avoid version conflict issue
-    implementation "com.fasterxml.jackson.core:jackson-core:2.13.4"
-    implementation "com.fasterxml.jackson.core:jackson-databind:2.13.4.2"
-    implementation "com.fasterxml.jackson.core:jackson-annotations:2.13.4"
+    // we inherit jackson-core from opensearch core
+    implementation "com.fasterxml.jackson.core:jackson-databind:2.14.0"
+    implementation "com.fasterxml.jackson.core:jackson-annotations:2.14.0"

    // used for serializing/deserializing rcf models.
    implementation group: 'io.protostuff', name: 'protostuff-core', version: '1.8.0'

src/test/java/org/opensearch/ad/ODFERestTestCase.java

+13 -10

@@ -11,6 +11,8 @@

 package org.opensearch.ad;

+import static org.opensearch.client.RestClientBuilder.DEFAULT_MAX_CONN_PER_ROUTE;
+import static org.opensearch.client.RestClientBuilder.DEFAULT_MAX_CONN_TOTAL;
 import static org.opensearch.commons.ConfigConstants.OPENSEARCH_SECURITY_SSL_HTTP_ENABLED;
 import static org.opensearch.commons.ConfigConstants.OPENSEARCH_SECURITY_SSL_HTTP_KEYSTORE_FILEPATH;
 import static org.opensearch.commons.ConfigConstants.OPENSEARCH_SECURITY_SSL_HTTP_KEYSTORE_KEYPASSWORD;

@@ -186,21 +188,18 @@ protected static void configureHttpsClient(RestClientBuilder builder, Settings s
            .ofNullable(System.getProperty("password"))
            .orElseThrow(() -> new RuntimeException("password is missing"));
        BasicCredentialsProvider credentialsProvider = new BasicCredentialsProvider();
-        credentialsProvider
-            .setCredentials(
-                new AuthScope(new HttpHost("localhost", 9200)),
-                new UsernamePasswordCredentials(userName, password.toCharArray())
-            );
+        final AuthScope anyScope = new AuthScope(null, -1);
+        credentialsProvider.setCredentials(anyScope, new UsernamePasswordCredentials(userName, password.toCharArray()));
        try {
            final TlsStrategy tlsStrategy = ClientTlsStrategyBuilder
                .create()
-                .setSslContext(SSLContextBuilder.create().loadTrustMaterial(null, (chains, authType) -> true).build())
-                // disable the certificate since our testing cluster just uses the default security configuration
                .setHostnameVerifier(NoopHostnameVerifier.INSTANCE)
+                .setSslContext(SSLContextBuilder.create().loadTrustMaterial(null, (chains, authType) -> true).build())
                .build();
-
            final PoolingAsyncClientConnectionManager connectionManager = PoolingAsyncClientConnectionManagerBuilder
                .create()
+                .setMaxConnPerRoute(DEFAULT_MAX_CONN_PER_ROUTE)
+                .setMaxConnTotal(DEFAULT_MAX_CONN_TOTAL)
                .setTlsStrategy(tlsStrategy)
                .build();
            return httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider).setConnectionManager(connectionManager);

@@ -212,8 +211,12 @@ protected static void configureHttpsClient(RestClientBuilder builder, Settings s
        final String socketTimeoutString = settings.get(CLIENT_SOCKET_TIMEOUT);
        final TimeValue socketTimeout = TimeValue
            .parseTimeValue(socketTimeoutString == null ? "60s" : socketTimeoutString, CLIENT_SOCKET_TIMEOUT);
-        builder
-            .setRequestConfigCallback(conf -> conf.setResponseTimeout(Timeout.ofMilliseconds(Math.toIntExact(socketTimeout.getMillis()))));
+        builder.setRequestConfigCallback(conf -> {
+            Timeout timeout = Timeout.ofMilliseconds(Math.toIntExact(socketTimeout.getMillis()));
+            conf.setConnectTimeout(timeout);
+            conf.setResponseTimeout(timeout);
+            return conf;
+        });
        if (settings.hasValue(CLIENT_PATH_PREFIX)) {
            builder.setPathPrefix(settings.get(CLIENT_PATH_PREFIX));
        }

src/test/java/org/opensearch/ad/e2e/AbstractSyntheticDataTest.java

+242

@@ -0,0 +1,242 @@
+/*
+ * SPDX-License-Identifier: Apache-2.0
+ *
+ * The OpenSearch Contributors require contributions made to
+ * this file be licensed under the Apache-2.0 license or a
+ * compatible open source license.
+ *
+ * Modifications Copyright OpenSearch Contributors. See
+ * GitHub history for details.
+ */
+
+package org.opensearch.ad.e2e;
+
+import static org.opensearch.ad.TestHelpers.toHttpEntity;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.BACKOFF_MINUTES;
+import static org.opensearch.ad.settings.AnomalyDetectorSettings.MAX_RETRY_FOR_UNRESPONSIVE_NODE;
+
+import java.io.File;
+import java.io.FileReader;
+import java.io.IOException;
+import java.io.InputStreamReader;
+import java.nio.charset.Charset;
+import java.time.Instant;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.Locale;
+import java.util.Map;
+
+import org.apache.hc.core5.http.HttpHeaders;
+import org.apache.hc.core5.http.message.BasicHeader;
+import org.opensearch.ad.ODFERestTestCase;
+import org.opensearch.ad.TestHelpers;
+import org.opensearch.client.Request;
+import org.opensearch.client.RequestOptions;
+import org.opensearch.client.Response;
+import org.opensearch.client.RestClient;
+import org.opensearch.client.WarningsHandler;
+import org.opensearch.common.Strings;
+import org.opensearch.common.xcontent.XContentBuilder;
+import org.opensearch.common.xcontent.json.JsonXContent;
+
+import com.google.common.collect.ImmutableList;
+import com.google.gson.JsonArray;
+import com.google.gson.JsonObject;
+import com.google.gson.JsonParser;
+
+public class AbstractSyntheticDataTest extends ODFERestTestCase {
+    /**
+     * In real time AD, we mute a node for a detector if that node keeps returning
+     * ResourceNotFoundException (5 times in a row). This is a problem for batch mode
+     * testing as we issue a large amount of requests quickly. Due to the speed, we
+     * won't be able to finish cold start before the ResourceNotFoundException mutes
+     * a node. Since our test case has only one node, there is no other nodes to fall
+     * back on. Here we disable such fault tolerance by setting max retries before
+     * muting to a large number and the actual wait time during muting to 0.
+     *
+     * @throws IOException when failing to create http request body
+     */
+    protected void disableResourceNotFoundFaultTolerence() throws IOException {
+        XContentBuilder settingCommand = JsonXContent.contentBuilder();
+
+        settingCommand.startObject();
+        settingCommand.startObject("persistent");
+        settingCommand.field(MAX_RETRY_FOR_UNRESPONSIVE_NODE.getKey(), 100_000);
+        settingCommand.field(BACKOFF_MINUTES.getKey(), 0);
+        settingCommand.endObject();
+        settingCommand.endObject();
+        Request request = new Request("PUT", "/_cluster/settings");
+        request.setJsonEntity(Strings.toString(settingCommand));
+
+        adminClient().performRequest(request);
+    }
+
+    protected List<JsonObject> getData(String datasetFileName) throws Exception {
+        JsonArray jsonArray = JsonParser
+            .parseReader(new FileReader(new File(getClass().getResource(datasetFileName).toURI()), Charset.defaultCharset()))
+            .getAsJsonArray();
+        List<JsonObject> list = new ArrayList<>(jsonArray.size());
+        jsonArray.iterator().forEachRemaining(i -> list.add(i.getAsJsonObject()));
+        return list;
+    }
+
+    protected Map<String, Object> getDetectionResult(String detectorId, Instant begin, Instant end, RestClient client) {
+        try {
+            Request request = new Request(
+                "POST",
+                String.format(Locale.ROOT, "/_opendistro/_anomaly_detection/detectors/%s/_run", detectorId)
+            );
+            request
+                .setJsonEntity(
+                    String.format(Locale.ROOT, "{ \"period_start\": %d, \"period_end\": %d }", begin.toEpochMilli(), end.toEpochMilli())
+                );
+            return entityAsMap(client.performRequest(request));
+        } catch (Exception e) {
+            throw new RuntimeException(e);
+        }
+    }
+
+    protected void bulkIndexTrainData(
+        String datasetName,
+        List<JsonObject> data,
+        int trainTestSplit,
+        RestClient client,
+        String categoryField
+    ) throws Exception {
+        Request request = new Request("PUT", datasetName);
+        String requestBody = null;
+        if (Strings.isEmpty(categoryField)) {
+            requestBody = "{ \"mappings\": { \"properties\": { \"timestamp\": { \"type\": \"date\"},"
+                + " \"Feature1\": { \"type\": \"double\" }, \"Feature2\": { \"type\": \"double\" } } } }";
+        } else {
+            requestBody = String
+                .format(
+                    Locale.ROOT,
+                    "{ \"mappings\": { \"properties\": { \"timestamp\": { \"type\": \"date\"},"
+                        + " \"Feature1\": { \"type\": \"double\" }, \"Feature2\": { \"type\": \"double\" },"
+                        + "\"%s\": { \"type\": \"keyword\"} } } }",
+                    categoryField
+                );
+        }
+
+        request.setJsonEntity(requestBody);
+        setWarningHandler(request, false);
+        client.performRequest(request);
+        Thread.sleep(1_000);
+
+        StringBuilder bulkRequestBuilder = new StringBuilder();
+        for (int i = 0; i < trainTestSplit; i++) {
+            bulkRequestBuilder.append("{ \"index\" : { \"_index\" : \"" + datasetName + "\", \"_id\" : \"" + i + "\" } }\n");
+            bulkRequestBuilder.append(data.get(i).toString()).append("\n");
+        }
+        TestHelpers
+            .makeRequest(
+                client,
+                "POST",
+                "_bulk?refresh=true",
+                null,
+                toHttpEntity(bulkRequestBuilder.toString()),
+                ImmutableList.of(new BasicHeader(HttpHeaders.USER_AGENT, "Kibana"))
+            );
+        Thread.sleep(1_000);
+        waitAllSyncheticDataIngested(trainTestSplit, datasetName, client);
+    }
+
+    protected String createDetector(
+        String datasetName,
+        int intervalMinutes,
+        RestClient client,
+        String categoryField,
+        long windowDelayInMins
+    ) throws Exception {
+        Request request = new Request("POST", "/_plugins/_anomaly_detection/detectors/");
+        String requestBody = null;
+        if (Strings.isEmpty(categoryField)) {
+            requestBody = String
+                .format(
+                    Locale.ROOT,
+                    "{ \"name\": \"test\", \"description\": \"test\", \"time_field\": \"timestamp\""
+                        + ", \"indices\": [\"%s\"], \"feature_attributes\": [{ \"feature_name\": \"feature 1\", \"feature_enabled\": "
+                        + "\"true\", \"aggregation_query\": { \"Feature1\": { \"sum\": { \"field\": \"Feature1\" } } } }, { \"feature_name\""
+                        + ": \"feature 2\", \"feature_enabled\": \"true\", \"aggregation_query\": { \"Feature2\": { \"sum\": { \"field\": "
+                        + "\"Feature2\" } } } }], \"detection_interval\": { \"period\": { \"interval\": %d, \"unit\": \"Minutes\" } }, "
+                        + "\"window_delay\": { \"period\": {\"interval\": %d, \"unit\": \"MINUTES\"}},"
+                        + "\"schema_version\": 0 }",
+                    datasetName,
+                    intervalMinutes,
+                    windowDelayInMins
+                );
+        } else {
+            requestBody = String
+                .format(
+                    Locale.ROOT,
+                    "{ \"name\": \"test\", \"description\": \"test\", \"time_field\": \"timestamp\""
+                        + ", \"indices\": [\"%s\"], \"feature_attributes\": [{ \"feature_name\": \"feature 1\", \"feature_enabled\": "
+                        + "\"true\", \"aggregation_query\": { \"Feature1\": { \"sum\": { \"field\": \"Feature1\" } } } }, { \"feature_name\""
+                        + ": \"feature 2\", \"feature_enabled\": \"true\", \"aggregation_query\": { \"Feature2\": { \"sum\": { \"field\": "
+                        + "\"Feature2\" } } } }], \"detection_interval\": { \"period\": { \"interval\": %d, \"unit\": \"Minutes\" } }, "
+                        + "\"category_field\": [\"%s\"], "
+                        + "\"window_delay\": { \"period\": {\"interval\": %d, \"unit\": \"MINUTES\"}},"
+                        + "\"schema_version\": 0 }",
+                    datasetName,
+                    intervalMinutes,
+                    categoryField,
+                    windowDelayInMins
+                );
+        }
+
+        request.setJsonEntity(requestBody);
+        Map<String, Object> response = entityAsMap(client.performRequest(request));
+        String detectorId = (String) response.get("_id");
+        Thread.sleep(1_000);
+        return detectorId;
+    }
+
+    protected void waitAllSyncheticDataIngested(int expectedSize, String datasetName, RestClient client) throws Exception {
+        int maxWaitCycles = 3;
+        do {
+            Request request = new Request("POST", String.format(Locale.ROOT, "/%s/_search", datasetName));
+            request
+                .setJsonEntity(
+                    String
+                        .format(
+                            Locale.ROOT,
+                            "{\"query\": {"
+                                + " \"match_all\": {}"
+                                + " },"
+                                + " \"size\": 1,"
+                                + " \"sort\": ["
+                                + " {"
+                                + " \"timestamp\": {"
+                                + " \"order\": \"desc\""
+                                + " }"
+                                + " }"
+                                + " ]}"
+                        )
+                );
+            // Make sure all of the test data has been ingested
+            // Expected response:
+            // "_index":"synthetic","_type":"_doc","_id":"10080","_score":null,"_source":{"timestamp":"2019-11-08T00:00:00Z","Feature1":156.30028000000001,"Feature2":100.211205,"host":"host1"},"sort":[1573171200000]}
+            Response response = client.performRequest(request);
+            JsonObject json = JsonParser
+                .parseReader(new InputStreamReader(response.getEntity().getContent(), Charset.defaultCharset()))
+                .getAsJsonObject();
+            JsonArray hits = json.getAsJsonObject("hits").getAsJsonArray("hits");
+            if (hits != null
+                && hits.size() == 1
+                && expectedSize - 1 == hits.get(0).getAsJsonObject().getAsJsonPrimitive("_id").getAsLong()) {
+                break;
+            } else {
+                request = new Request("POST", String.format(Locale.ROOT, "/%s/_refresh", datasetName));
+                client.performRequest(request);
+            }
+            Thread.sleep(1_000);
+        } while (maxWaitCycles-- >= 0);
+    }
+
+    protected void setWarningHandler(Request request, boolean strictDeprecationMode) {
+        RequestOptions.Builder options = RequestOptions.DEFAULT.toBuilder();
+        options.setWarningsHandler(strictDeprecationMode ? WarningsHandler.STRICT : WarningsHandler.PERMISSIVE);
+        request.setOptions(options.build());
+    }
+}
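As a usage illustration only (not part of the commit), a subclass might wire these helpers together roughly as follows. SingleStreamModelPerfIT in this commit follows a similar flow, but the class name, dataset resource name, split size, interval, and final assertion below are hypothetical; the helper method signatures and the ISO-8601 timestamp format come from the code above.

package org.opensearch.ad.e2e;

import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.List;
import java.util.Map;

import com.google.gson.JsonObject;

// Hypothetical example built on the helpers in AbstractSyntheticDataTest.
public class ExampleSyntheticDataIT extends AbstractSyntheticDataTest {

    public void testDetectorOnSyntheticData() throws Exception {
        // Avoid node muting during the rapid sequence of _run calls (see the Javadoc above).
        disableResourceNotFoundFaultTolerence();

        // "synthetic.json" is a placeholder resource name, not a file shipped by this commit.
        List<JsonObject> data = getData("synthetic.json");
        int trainTestSplit = 1000;
        int intervalMinutes = 1;

        bulkIndexTrainData("synthetic", data, trainTestSplit, client(), null);
        String detectorId = createDetector("synthetic", intervalMinutes, client(), null, 0);

        // The sample hit in waitAllSyncheticDataIngested shows ISO-8601 timestamps,
        // so the field is parsed as a string here.
        Instant begin = Instant.parse(data.get(trainTestSplit).get("timestamp").getAsString());
        Instant end = begin.plus(intervalMinutes, ChronoUnit.MINUTES);
        Map<String, Object> result = getDetectionResult(detectorId, begin, end, client());

        // A real benchmark compares the returned anomaly grades against the labeled anomalies;
        // here we only check that the detector produced a result.
        assertFalse(result.isEmpty());
    }
}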
