# Performance Lab

JavaScript performance tests for Handsontable

## Install

The minimal Node version this project runs on is 14. Make sure your version meets this requirement before you continue with the installation.

Install dependencies via [NPM](https://npmjs.com/):

```sh
$ npm install
```

Test results are stored in a MongoDB instance, so it is necessary to set up the database before you run the script. If you have [Docker](https://www.docker.com/) installed, you can start the required services using [docker-compose](https://github.com/docker/compose):

```sh
docker-compose -f docker/docker-compose.yml up
```

## Run It

To run the performance tests and save the results to the database, execute:

```sh
$ ./bin/hot-perf run
```

or

```sh
$ npm run start
```

Performance tests are defined in the `test/spec` directory. Each test contains code that prepares Handsontable for testing, plus a block of code that is then executed several times (the number of runs is defined as `SAMPLE_SIZE` in the `lib/config.js` file). Stats are collected after each call, and once the number of iterations reaches `SAMPLE_SIZE`, the result is saved to the database.

Once completed, you can view the generated test reports by running `./bin/hot-perf local-server benchmark-viewer`. It serves a page where you can compare your generated results across different Handsontable versions and different test cases.

## Usage

##### ```> ./bin/hot-perf run``` (or ```> ./bin/hot-perf r```)

Runs the benchmark by executing all spec files defined in the `test/spec` directory.
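The sampling loop described above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: apart from `SAMPLE_SIZE`, every name here (`benchmark`, `runScenario`, the timing logic) is a hypothetical placeholder.

```javascript
// Minimal sketch of the sampling loop described above.
// SAMPLE_SIZE mirrors the constant in lib/config.js; everything else
// (benchmark, runScenario) is a hypothetical placeholder, not the
// project's real API.
const SAMPLE_SIZE = 10;

function benchmark(runScenario) {
  const samples = [];
  for (let i = 0; i < SAMPLE_SIZE; i += 1) {
    const start = Date.now();
    runScenario();                     // the block of code under test
    samples.push(Date.now() - start);  // stats collected after each call
  }
  // Once the iteration count hits SAMPLE_SIZE, the aggregated result
  // would be saved to the database; here we just return the mean.
  return samples.reduce((a, b) => a + b, 0) / samples.length;
}
```

In the real project the scenario also receives a prepared Handsontable instance, and the collected stats are richer than a single mean.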
Once completed, the results are saved to the database.

##### ```> ./bin/hot-perf local-server benchmark-viewer``` (or ```> ./bin/hot-perf ls bv```)

Runs a local server where you can see the test results.

Arguments:

- ```test-runner``` (or ```tr```) - Serves a test-runner page, which is used by Protractor to test Handsontable.
- ```benchmark-viewer``` (or ```bv```) - Serves a page used to view the results generated by the `run` command.

### Global options:

- ```--hot-version``` - Selects the version of Handsontable to test (it has to be a version accessible through [jsdelivr](https://www.jsdelivr.com/)). If not specified, the `latest` tag is used. For example, `--hot-version=6.2.2`.
- ```--hot-server``` - Selects a server to be used to serve the Handsontable assets. For example, `--hot-server=http://localhost:8082`. If used, the assets are loaded from the `dist` directory, e.g. `http://localhost:8082/dist/handsontable.full.css`.
- ```--test-name``` - The name under which the test results will be saved. For example, `--test-name=my-feature`. If results are already stored under that name, they will be replaced with the new results.
- ```--cpu-throttle-rate``` - Sets the CPU throttle rate for the browser. Slowing down the CPU's clock speed makes it easier to detect slight performance deviations that would normally be invisible on a fast computer. It's advisable to run the tests with the rate set to 4, for example, `--cpu-throttle-rate=4`.

## License

[MIT License](https://opensource.org/licenses/MIT)