Rally benchmark #1522

Merged: 49 commits, Nov 7, 2023

Commits
4ec6921
add rally subcommand in benchmark
Oct 24, 2023
5f14e8c
add rally corpus output dir
Oct 24, 2023
787276d
export GenerateRallyTrack
Oct 24, 2023
9557ce4
add rally runner
Oct 24, 2023
c9c3cc7
fix generator config yaml for system benchmark
Oct 24, 2023
8dda99d
add rally benchmark test files
Oct 24, 2023
dfd63bd
fix from CI
Oct 24, 2023
08acf94
fix from CI
Oct 24, 2023
d320aa5
fix repeated print of parameters
Oct 24, 2023
298b97f
fix ES host env variable for rally
Oct 24, 2023
373dba8
remove wait_for_data_timeout
Oct 25, 2023
918601f
changelog
Oct 25, 2023
dccc3c7
spec reference
Oct 25, 2023
ddf91d6
cr fixes
Oct 30, 2023
45d5659
fix check-static
Oct 31, 2023
e1ab9b6
remove creation of benchmark policy, get rid of input and vars
Oct 31, 2023
0508a04
Update cmd/benchmark.go
aspacca Oct 31, 2023
ab63709
Update cmd/benchmark.go
aspacca Oct 31, 2023
755a435
Update internal/benchrunner/runners/rally/metrics.go
aspacca Oct 31, 2023
a145c72
Update internal/benchrunner/runners/rally/runner.go
aspacca Oct 31, 2023
145c6ae
Update internal/benchrunner/runners/rally/metrics.go
aspacca Nov 1, 2023
6e8effc
fix cr suggestions merge
Nov 1, 2023
b423b59
remove input and vars reference in the package
Nov 1, 2023
0191b17
fix check-static
Nov 1, 2023
f7f98ca
Update internal/benchrunner/runners/rally/metrics.go
aspacca Nov 1, 2023
3447828
Update internal/benchrunner/runners/rally/metrics.go
aspacca Nov 1, 2023
9ab7a89
Update internal/benchrunner/runners/rally/runner.go
aspacca Nov 1, 2023
b9c98a4
Update internal/benchrunner/runners/rally/runner.go
aspacca Nov 1, 2023
2dd4c65
Update internal/benchrunner/runners/rally/metrics.go
aspacca Nov 1, 2023
92b7014
Update internal/benchrunner/runners/rally/runner.go
aspacca Nov 1, 2023
9ba06f5
cr and merge from github fixes
Nov 1, 2023
e3612de
move CreateRallyTrackDir to rally package and handle package installa…
Nov 1, 2023
00402c7
use package installer
Nov 2, 2023
c8a6502
Update internal/corpusgenerator/rally.go
aspacca Nov 2, 2023
b1adc98
Update internal/benchrunner/runners/rally/runner.go
aspacca Nov 2, 2023
b28b55d
handle error logging in metrics.go, use bulk API
Nov 2, 2023
b8bd588
docs
Nov 2, 2023
f6532d9
add refresh index in metrics, collect only start and end
Nov 2, 2023
592cdcb
wait only for warmup
Nov 2, 2023
671867c
remove warmup, include unloaded segment in pipeline stats
Nov 2, 2023
86a6b2f
Update internal/cobraext/flags.go
aspacca Nov 2, 2023
13672e2
docs about replay of rally tracks, bugfixes on metrics
Nov 3, 2023
3dc1a58
temporary: alias github.com/elastic/package-spec/v3 to github.com/asp…
Nov 3, 2023
9582206
remove warmup_time_period
Nov 3, 2023
bb18dc8
temporary: format version
Nov 3, 2023
5804ef9
remove replace in go mod. update to latest commit
Nov 6, 2023
cc39585
make check-static
Nov 7, 2023
943367a
github.com/elastic/package-spec/v3@v3.0.1
Nov 7, 2023
f406794
Merge branch 'main' into rally-benchmark
Nov 7, 2023

README.md: 12 additions & 0 deletions
@@ -153,6 +153,12 @@ These benchmarks allow you to benchmark any Ingest Node Pipelines defined by you

For details on how to configure pipeline benchmarks for a package, review the [HOWTO guide](./docs/howto/pipeline_benchmarking.md).

#### Rally Benchmarks

These benchmarks allow you to benchmark an integration corpus with rally.

For details on how to configure rally benchmarks for a package, review the [HOWTO guide](./docs/howto/rally_benchmarking.md).
Review comments on this line:

Contributor: Is this link correct? Trying to find it in this PR.

Contributor (author): The link is correct, but I haven't written the docs yet. :)

Contributor: Can we add these to the PR? I was looking for them as I hit some issues testing the PR. I'll comment more on the issues I hit soon.

Contributor (author): Yes, it's planned for this PR: I listed the missing docs in the description. :)

Member: Will give it a try when there are docs 🙂

#### System Benchmarks

These benchmarks allow you to benchmark an integration end to end.
@@ -175,6 +181,12 @@ _Context: package_

Run pipeline benchmarks for the package.

### `elastic-package benchmark rally`

_Context: package_

Run rally benchmarks for the package (esrally needs to be installed in the path of the system).

### `elastic-package benchmark system`

_Context: package_
cmd/benchmark.go: 160 additions & 4 deletions
@@ -25,7 +25,9 @@ import (
"github.com/elastic/elastic-package/internal/benchrunner"
"github.com/elastic/elastic-package/internal/benchrunner/reporters"
"github.com/elastic/elastic-package/internal/benchrunner/reporters/outputs"
benchcommon "github.com/elastic/elastic-package/internal/benchrunner/runners/common"
"github.com/elastic/elastic-package/internal/benchrunner/runners/pipeline"
"github.com/elastic/elastic-package/internal/benchrunner/runners/rally"
"github.com/elastic/elastic-package/internal/benchrunner/runners/system"
"github.com/elastic/elastic-package/internal/cobraext"
"github.com/elastic/elastic-package/internal/common"
@@ -48,6 +50,12 @@ These benchmarks allow you to benchmark any Ingest Node Pipelines defined by you

For details on how to configure pipeline benchmarks for a package, review the [HOWTO guide](./docs/howto/pipeline_benchmarking.md).

#### Rally Benchmarks

These benchmarks allow you to benchmark an integration corpus with rally.

For details on how to configure rally benchmarks for a package, review the [HOWTO guide](./docs/howto/rally_benchmarking.md).

#### System Benchmarks

These benchmarks allow you to benchmark an integration end to end.
@@ -66,6 +74,9 @@ func setupBenchmarkCommand() *cobraext.Command {
    pipelineCmd := getPipelineCommand()
    cmd.AddCommand(pipelineCmd)

    rallyCmd := getRallyCommand()
    cmd.AddCommand(rallyCmd)

    systemCmd := getSystemCommand()
    cmd.AddCommand(systemCmd)

@@ -213,6 +224,151 @@ func pipelineCommandAction(cmd *cobra.Command, args []string) error {
    return nil
}

func getRallyCommand() *cobra.Command {
    cmd := &cobra.Command{
        Use:   "rally",
        Short: "Run rally benchmarks",
        Long:  "Run rally benchmarks for the package (esrally needs to be installed in the path of the system)",
        Args:  cobra.NoArgs,
        RunE:  rallyCommandAction,
    }

    cmd.Flags().StringP(cobraext.BenchNameFlagName, "", "", cobraext.BenchNameFlagDescription)
    cmd.Flags().BoolP(cobraext.BenchReindexToMetricstoreFlagName, "", false, cobraext.BenchReindexToMetricstoreFlagDescription)
    cmd.Flags().DurationP(cobraext.BenchMetricsIntervalFlagName, "", time.Second, cobraext.BenchMetricsIntervalFlagDescription)
    cmd.Flags().DurationP(cobraext.DeferCleanupFlagName, "", 0, cobraext.DeferCleanupFlagDescription)
    cmd.Flags().String(cobraext.VariantFlagName, "", cobraext.VariantFlagDescription)
    cmd.Flags().StringP(cobraext.BenchCorpusRallyTrackOutputDirFlagName, "", "", cobraext.BenchCorpusRallyTrackOutputDirFlagDescription)
    cmd.Flags().BoolP(cobraext.BenchCorpusRallyDryRunFlagName, "", false, cobraext.BenchCorpusRallyDryRunFlagDescription)

    return cmd
}

func rallyCommandAction(cmd *cobra.Command, args []string) error {
    cmd.Println("Run rally benchmarks for the package")

    variant, err := cmd.Flags().GetString(cobraext.VariantFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.VariantFlagName)
    }

    benchName, err := cmd.Flags().GetString(cobraext.BenchNameFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.BenchNameFlagName)
    }

    deferCleanup, err := cmd.Flags().GetDuration(cobraext.DeferCleanupFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.DeferCleanupFlagName)
    }

    metricsInterval, err := cmd.Flags().GetDuration(cobraext.BenchMetricsIntervalFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.BenchMetricsIntervalFlagName)
    }

    dataReindex, err := cmd.Flags().GetBool(cobraext.BenchReindexToMetricstoreFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.BenchReindexToMetricstoreFlagName)
    }

    rallyTrackOutputDir, err := cmd.Flags().GetString(cobraext.BenchCorpusRallyTrackOutputDirFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.BenchCorpusRallyTrackOutputDirFlagName)
    }

    rallyDryRun, err := cmd.Flags().GetBool(cobraext.BenchCorpusRallyDryRunFlagName)
    if err != nil {
        return cobraext.FlagParsingError(err, cobraext.BenchCorpusRallyDryRunFlagName)
    }

    packageRootPath, found, err := packages.FindPackageRoot()
    if !found {
        return errors.New("package root not found")
    }
    if err != nil {
        return fmt.Errorf("locating package root failed: %w", err)
    }

    profile, err := cobraext.GetProfileFlag(cmd)
    if err != nil {
        return err
    }

    signal.Enable()

    esClient, err := stack.NewElasticsearchClientFromProfile(profile)
    if err != nil {
        return fmt.Errorf("can't create Elasticsearch client: %w", err)
    }
    err = esClient.CheckHealth(cmd.Context())
    if err != nil {
        return err
    }

    kc, err := stack.NewKibanaClientFromProfile(profile)
    if err != nil {
        return fmt.Errorf("can't create Kibana client: %w", err)
    }

    withOpts := []rally.OptionFunc{
        rally.WithVariant(variant),
        rally.WithBenchmarkName(benchName),
        rally.WithDeferCleanup(deferCleanup),
        rally.WithMetricsInterval(metricsInterval),
        rally.WithDataReindexing(dataReindex),
        rally.WithPackageRootPath(packageRootPath),
        rally.WithESAPI(esClient.API),
        rally.WithKibanaClient(kc),
        rally.WithProfile(profile),
        rally.WithRallyTrackOutputDir(rallyTrackOutputDir),
        rally.WithRallyDryRun(rallyDryRun),
    }

    esMetricsClient, err := initializeESMetricsClient(cmd.Context())
    if err != nil {
        return fmt.Errorf("can't create Elasticsearch metrics client: %w", err)
    }
    if esMetricsClient != nil {
        withOpts = append(withOpts, rally.WithESMetricsAPI(esMetricsClient.API))
    }

    runner := rally.NewRallyBenchmark(rally.NewOptions(withOpts...))

    r, err := benchrunner.Run(runner)
    if errors.Is(err, rally.ErrDryRun) {
        return nil
    }

    if err != nil {
        return fmt.Errorf("error running package rally benchmarks: %w", err)
    }

    multiReport, ok := r.(reporters.MultiReportable)
    if !ok {
        return fmt.Errorf("rally benchmark is expected to return multiple reports")
    }

    reports := multiReport.Split()
    if len(reports) != 2 {
        return fmt.Errorf("rally benchmark is expected to return a human and a file report")
    }

    // human report will always be the first
    human := reports[0]
    if err := reporters.WriteReportable(reporters.Output(outputs.ReportOutputSTDOUT), human); err != nil {
        return fmt.Errorf("error writing benchmark report: %w", err)
    }

    // file report will always be the second
    file := reports[1]
    if err := reporters.WriteReportable(reporters.Output(outputs.ReportOutputFile), file); err != nil {
        return fmt.Errorf("error writing benchmark report: %w", err)
    }

    return nil
}
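
For readers unfamiliar with the pattern, the `withOpts` slice built in `rallyCommandAction` above uses Go's functional-options idiom: each `rally.With...` call yields a `rally.OptionFunc`, and `rally.NewOptions(withOpts...)` combines them into the runner's configuration. Below is a minimal, self-contained sketch of that idiom; the names (`sketchOptions`, `WithBenchName`, `WithDryRun`, `newSketchOptions`) are illustrative stand-ins, not the real `rally` package API.

```go
package main

import "fmt"

// sketchOptions stands in for the options struct a benchmark runner might hold.
type sketchOptions struct {
    benchName string
    dryRun    bool
}

// sketchOptionFunc mirrors the shape of rally.OptionFunc: a function that
// mutates the options being built.
type sketchOptionFunc func(*sketchOptions)

// WithBenchName and WithDryRun are illustrative option constructors.
func WithBenchName(name string) sketchOptionFunc {
    return func(o *sketchOptions) { o.benchName = name }
}

func WithDryRun(dryRun bool) sketchOptionFunc {
    return func(o *sketchOptions) { o.dryRun = dryRun }
}

// newSketchOptions applies each option, in order, over the zero-value defaults.
func newSketchOptions(fns ...sketchOptionFunc) sketchOptions {
    opts := sketchOptions{}
    for _, fn := range fns {
        fn(&opts)
    }
    return opts
}

func main() {
    opts := newSketchOptions(
        WithBenchName("logs-benchmark"),
        WithDryRun(true),
    )
    fmt.Printf("%+v\n", opts) // prints {benchName:logs-benchmark dryRun:true}
}
```

Applying options over a zero-value struct keeps defaults in one place and lets callers append options conditionally, as the command does with `rally.WithESMetricsAPI` when a metricstore client is configured.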

func getSystemCommand() *cobra.Command {
    cmd := &cobra.Command{
        Use: "system",
@@ -410,10 +566,10 @@ func generateDataStreamCorpusCommandAction(cmd *cobra.Command, _ []string) error
}

func initializeESMetricsClient(ctx context.Context) (*elasticsearch.Client, error) {
-    address := os.Getenv(system.ESMetricstoreHostEnv)
-    user := os.Getenv(system.ESMetricstoreUsernameEnv)
-    pass := os.Getenv(system.ESMetricstorePasswordEnv)
-    cacert := os.Getenv(system.ESMetricstoreCACertificateEnv)
+    address := os.Getenv(benchcommon.ESMetricstoreHostEnv)
+    user := os.Getenv(benchcommon.ESMetricstoreUsernameEnv)
+    pass := os.Getenv(benchcommon.ESMetricstorePasswordEnv)
+    cacert := os.Getenv(benchcommon.ESMetricstoreCACertificateEnv)
    if address == "" || user == "" || pass == "" {
        logger.Debugf("can't initialize metricstore, missing environment configuration")
        return nil, nil