vLLM
Performance Benchmark
Public
Builds (branch: KuntaiDu:kuntai-benchmark-dev)
All builds below were created by Kuntai Du on branch KuntaiDu:kuntai-benchmark-dev.

| Build | Message | Duration | Commit | Created |
|-------|---------|----------|--------|---------|
| #3866 | visual adjustment | 32s | 8260d3889 | Thu 4th Jul 2024 at 7:05 AM |
| #3865 | adjust visualization | 7m | 3a70a60aa | Thu 4th Jul 2024 at 6:53 AM |
| #3859 | support figure visualization | 6m | 7c845ae8a | Thu 4th Jul 2024 at 6:32 AM |
| #3855 | add visualization step | 29m | e27677ae4 | Thu 4th Jul 2024 at 5:50 AM |
| #3817 | remove headers in result | 4h | a3e4355c2 | Wed 3rd Jul 2024 at 11:15 PM |
| #3816 | reduce nightly pipeline length | 7m | 59072ed19 | Wed 3rd Jul 2024 at 11:07 PM |
| #3815 | remove annotation inside the job --- run the annotation at the last. | 8s | 22e78b5a7 | Wed 3rd Jul 2024 at 11:06 PM |
| #3803 | add standard deviation for each metric -- to plot confidence interval | 35m | c5e666209 | Wed 3rd Jul 2024 at 10:30 PM |
| #3802 | Merge branch 'vllm-project:main' into kuntai-benchmark-dev | 4m | 2e577b39d | Wed 3rd Jul 2024 at 10:25 PM |
| #3783 | freeze fp16 benchmark | 3h | 7b483a128 | Wed 3rd Jul 2024 at 6:45 PM |
| #3721 | reduce calib size | 3h | 0313c19e8 | Wed 3rd Jul 2024 at 7:01 AM |
| #3718 | test fp8 performance | 24m | b8dbd8ac9 | Wed 3rd Jul 2024 at 6:35 AM |
| #3643 | change model | 37m | 44e2d9715 | Tue 2nd Jul 2024 at 9:11 PM |
| #3639 | move kv cache dtype inside vllm | 7m | 3d20f9235 | Tue 2nd Jul 2024 at 9:02 PM |
| #3560 | use llama2 for local debugging | 6h | 019802a93 | Tue 2nd Jul 2024 at 6:56 AM |
| #3559 | add fp8 for vllm | 18m | 459fb2f01 | Tue 2nd Jul 2024 at 6:37 AM |
| #3558 | move fp8 quantization to common parameters | 20m | f1a795557 | Tue 2nd Jul 2024 at 6:16 AM |
| #3545 | reduce test case to only mixtral, debug lmdeploy + mixtral | 1h | 162700f10 | Tue 2nd Jul 2024 at 5:05 AM |
| #3425 | bug fix: need to use llama checkpoint converter for mixtral model | 5h | 6c566cbe7 | Mon 1st Jul 2024 at 6:15 PM |
| #3335 | bring back the full test suite | 1h | 96bc2490c | Mon 1st Jul 2024 at 8:25 AM |