QuickPotato

A library for the Python programming language that allows for effortless profiling with high-quality visualizations and empowers you to add a performance angle to test-driven development.

QuickPotato is a Python library that aims to make it easier to rapidly profile your software and produce powerful code visualizations that enable you to quickly investigate where potential performance bottlenecks are hidden.

QuickPotato also aims to provide a path for adding an automated performance testing angle to your regular unit tests or test-driven development test cases, allowing you to test your code early in the development life cycle in a simple, reliable, and fast way.

Installation

Install using pip or download the source code from GitHub.

pip install QuickPotato

Do note that QuickPotato hasn't been released on the Python Package Index yet.
Please just grab the source code or the latest release from GitHub for now :).
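
Until it lands on PyPI, one option is to let pip install straight from the GitHub repository (a sketch; this assumes git is available on your machine):

pip install git+https://github.com/JoeyHendricks/QuickPotato.git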

Generating Flame Graphs

[Example Python code flame graph generated by QuickPotato]

How to interpret the Flame Graphs generated by QuickPotato together with d3-flame-graph:

  • Each box is a function in the stack
  • The y-axis shows the stack depth; the top box shows what was on-CPU.
  • The x-axis does not show the passage of time; it spans the sample population and is ordered alphabetically.
  • The width of a box shows how long it was on-CPU or was part of a parent function that was on-CPU.

If you are unfamiliar with Flame Graphs, the best place to read about them is Brendan Gregg's website.

You can generate a Python flame graph with QuickPotato in the following way:

from QuickPotato.configuration.management import options
from QuickPotato.statistical.visualizations import FlameGraph
from QuickPotato.profiling.intrusive import performance_critical

options.enable_intrusive_profiling = True  # <-- Make sure intrusive profiling
                                           #     is enabled when using the decorator.


@performance_critical  # <-- Make sure you attach the performance critical decorator.
def i_am_a_slow_function():
    num = 6 ** 6 ** 6
    return len(str(num))


# Generate Flame Graph
FlameGraph().export(path="C:\\Temp\\")

Generating a CSV file

[Example CSV export generated by QuickPotato]

You can generate a CSV export in the following way:

from QuickPotato.configuration.management import options
from QuickPotato.statistical.visualizations import CsvFile
from QuickPotato.profiling.intrusive import performance_critical

options.enable_intrusive_profiling = True  # <-- Make sure intrusive profiling
                                           #     is enabled when using the decorator.


@performance_critical  # <-- Make sure you attach the performance critical decorator.
def i_am_a_slow_function():
    num = 6 ** 6 ** 6
    return len(str(num))


# Export measurements to a CSV file
CsvFile().export(path="C:\\Temp\\")

Boundary testing

Within QuickPotato, it is possible to create a performance test that validates whether your code breaches any
defined boundaries. An example of this sort of test can be found in the snippet below:

from QuickPotato.profiling.intrusive import performance_test as pt
from example.example_code import fast_method

# Define test case name
pt.test_case_name = "test_performance"

# Establish performance boundaries
pt.max_and_min_boundary_for_average = {"max": 1, "min": 0.001}

# Execute method under test
for _ in range(0, 10):
    fast_method()

# Analyse profiled results; True if no boundaries are breached, otherwise False
results = pt.verify_benchmark_against_set_boundaries
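
Because this check returns a plain boolean, it slots straight into an existing unit testing framework, which is how you can add the performance angle to your regular test-driven development cycle mentioned earlier. The sketch below assumes the standard library's unittest module and uses only the attributes shown above; the class and method names are illustrative:

import unittest

from QuickPotato.profiling.intrusive import performance_test as pt
from example.example_code import fast_method


class TestPerformance(unittest.TestCase):

    def test_fast_method_stays_within_boundaries(self):
        # Name the test case so measurements are grouped together
        pt.test_case_name = "test_performance"

        # Establish performance boundaries for the average response time
        pt.max_and_min_boundary_for_average = {"max": 1, "min": 0.001}

        # Execute the method under test
        for _ in range(0, 10):
            fast_method()

        # Fail the unit test when a boundary is breached
        self.assertTrue(pt.verify_benchmark_against_set_boundaries)


if __name__ == "__main__":
    unittest.main()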

Regression testing

It is also possible to verify that there is no regression between the current benchmark and a previous baseline.
The method for creating such a test can also be found in the snippet below:

from QuickPotato.profiling.intrusive import performance_test as pt
from example.example_code import fast_method

# Define test case name
pt.test_case_name = "test_performance"

# Execute method under test
for _ in range(0, 10):
    fast_method()

# Analyse results for change; True if there is no regression, otherwise False
results = pt.verify_benchmark_against_previous_baseline

Options you can configure

QuickPotato comes equipped with a number of options you can configure to make sure it fits your needs.
Below you can find a list of all basic options:

from QuickPotato.configuration.management import options

# Profiling Settings
options.enable_intrusive_profiling = True 
options.enable_system_resource_collection = True

# Results Storage
options.connection_url = None  # <-- None will use SQLite and store results in the temp directory
options.enable_database_echo = False

# Storage Maintenance 
options.enable_auto_clean_up_old_test_results = True
options.maximum_number_saved_test_results = 10

Option states are saved in a static YAML options file.
That is why settings can be defined just once or changed on the fly.
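
Because the option states persist, you can also flip a setting on the fly around a specific piece of code. A minimal sketch, using only the options and decorator shown above (the function name is illustrative):

from QuickPotato.configuration.management import options
from QuickPotato.profiling.intrusive import performance_critical

# Switch intrusive profiling on before exercising the decorated code
options.enable_intrusive_profiling = True


@performance_critical
def quick_calculation():
    return sum(range(1000))


quick_calculation()  # <-- Executed with profiling enabled

# Switch it off again; later calls to decorated functions are not profiled
options.enable_intrusive_profiling = False

quick_calculation()  # <-- Executed with profiling disabled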

GitHub

https://github.com/JoeyHendricks/QuickPotato