Commit
[pre-commit.ci] pre-commit autoupdate (#2107)
Signed-off-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Sun, Xuehao <[email protected]>
pre-commit-ci[bot] and XuehaoSun authored Jan 10, 2025
1 parent 1c0fabb commit 14b8176
Showing 24 changed files with 29 additions and 31 deletions.
1 change: 1 addition & 0 deletions .azure-pipelines/scripts/codeScan/codespell/inc_dict.txt
@@ -1,5 +1,6 @@
 activ
 ans
+assertin
 datas
 ende
 lates
12 changes: 6 additions & 6 deletions .pre-commit-config.yaml
@@ -4,7 +4,7 @@ ci:

 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.6.0
+    rev: v5.0.0
     hooks:
       - id: end-of-file-fixer
         files: (.*\.(py|md|rst|yaml|yml))$
@@ -101,7 +101,7 @@ repos:
         )$
   - repo: https://github.com/psf/black.git
-    rev: 24.3.0
+    rev: 24.10.0
     hooks:
       - id: black
         files: (.*\.py)$
@@ -115,12 +115,12 @@ repos:
         )$
   - repo: https://github.com/asottile/blacken-docs
-    rev: 1.16.0
+    rev: 1.19.1
     hooks:
       - id: blacken-docs
         args: [--line-length=120, --skip-errors]
         additional_dependencies:
-          - black==24.3.0
+          - black==24.10.0
         exclude: |
           (?x)^(
               examples/.+|
@@ -130,7 +130,7 @@ repos:
         )$
   - repo: https://github.com/codespell-project/codespell
-    rev: v2.2.6
+    rev: v2.3.0
     hooks:
       - id: codespell
         args: [-w]
@@ -149,7 +149,7 @@ repos:
         )$
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: v0.3.5
+    rev: v0.8.6
     hooks:
       - id: ruff
         args: [--fix, --exit-non-zero-on-fix, --no-cache]
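These `rev:` bumps are what the `pre-commit autoupdate` command produces: it moves each hook's pinned `rev:` to the latest tag of that hook's repository. A minimal sketch of reproducing such an update locally, assuming `pre-commit` is installed and this is run from the repository root:

import subprocess

# `pre-commit autoupdate` rewrites each pinned `rev:` in .pre-commit-config.yaml
# to the newest tag of the corresponding hook repository.
subprocess.run(["pre-commit", "autoupdate"], check=True)

# Running the refreshed hooks over the whole tree surfaces any new findings,
# e.g. the extra word codespell v2.3.0 flags in this commit.
# check=False because hooks exit nonzero when they modify files.
subprocess.run(["pre-commit", "run", "--all-files"], check=False)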
2 changes: 1 addition & 1 deletion docs/source/CODE_OF_CONDUCT.md
@@ -14,7 +14,7 @@
 In the interest of fostering an open and welcoming environment, we as
 contributors and maintainers pledge to making participation in our project and
 our community a harassment-free experience for everyone, regardless of age, body
 size, disability, ethnicity, sex characteristics, gender identity and expression,
-level of experience, education, socio-economic status, nationality, personal
+level of experience, education, socioeconomic status, nationality, personal
 appearance, race, religion, or sexual identity and orientation.

 ## Our Standards
@@ -1241,7 +1241,7 @@ def loadDataset(
     data_ready = False

     # pre-process data if needed
-    # WARNNING: when memory mapping is used we get a collection of files
+    # WARNING: when memory mapping is used we get a collection of files
     if data_ready:
         print("Reading pre-processed data=%s" % (str(pro_data)))
         file = str(pro_data)
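For context on the comment fixed above: loading pre-processed shards with memory mapping leaves the data on disk as a collection of files rather than one in-memory array. A minimal numpy sketch of that situation, with hypothetical file names:

import numpy as np

# Write two hypothetical per-day shards, then reopen them memory-mapped:
# each np.load returns an array backed by its file, so the "dataset" is a
# collection of files rather than a single in-memory array.
for day in range(2):
    np.save("day_%d.npy" % day, np.arange(10, dtype=np.int64))

shards = [np.load("day_%d.npy" % day, mmap_mode="r") for day in range(2)]
print(sum(s.shape[0] for s in shards))  # row count without reading data into RAM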
@@ -111,7 +111,7 @@ def __init__(
         data_ready = False

         # pre-process data if needed
-        # WARNNING: when memory mapping is used we get a collection of files
+        # WARNING: when memory mapping is used we get a collection of files
         if data_ready:
            print("Reading pre-processed data=%s" % (str(pro_data)))
            file = str(pro_data)
2 changes: 1 addition & 1 deletion examples/notebook/dynas/BERT_SST2_Supernet_NAS.ipynb
@@ -189,7 +189,7 @@
     "# Perform Search\n",
     "\n",
     "After the DyNAS configuration parameters are set, the search process can be started. Depending on how many evaluations `config.dynas.num_evals` were defined, the search time can vary from hours to days. \n",
-    "The search process will populate the `config.dynas.results_csv_path` file and will also return a list of the final iteration's best sub-network population recommondation. \n",
+    "The search process will populate the `config.dynas.results_csv_path` file and will also return a list of the final iteration's best sub-network population recommendation. \n",
     "\n",
     "Note: example search results are provided for the plotting section if you wish to skip this step for now. "
    ]
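Once the search finishes, the CSV named by `config.dynas.results_csv_path` can be inspected directly. A minimal sketch, assuming only that the file is a plain CSV with one evaluated sub-network per row (column names vary with the search space, and the path here is hypothetical):

import pandas as pd

# In the notebook the path comes from config.dynas.results_csv_path.
results = pd.read_csv("dynas_results.csv")
print(results.shape)   # evaluated sub-networks x recorded metrics
print(results.head())  # first few candidates, for a quick sanity check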
2 changes: 1 addition & 1 deletion examples/notebook/dynas/MobileNetV3_Supernet_NAS.ipynb
@@ -191,7 +191,7 @@
     "# Perform Search\n",
     "\n",
     "After the DyNAS configuration parameters are set, the search process can be started. Depending on how many evaluations `config.dynas.num_evals` were defined, the search time can vary from hours to days. \n",
-    "The search process will populate the `config.dynas.results_csv_path` file and will also return a list of the final iteration's best sub-network population recommondation. \n",
+    "The search process will populate the `config.dynas.results_csv_path` file and will also return a list of the final iteration's best sub-network population recommendation. \n",
     "\n",
     "Note: example search results are provided for the plotting section if you wish to skip this step for now. "
    ]
@@ -288,7 +288,7 @@ def postprocess_qa_predictions_with_beam_search(

     assert len(predictions[0]) == len(
         features
-    ), f"Got {len(predictions[0])} predicitions and {len(features)} features."
+    ), f"Got {len(predictions[0])} predictions and {len(features)} features."

     # Build a map example to its corresponding features.
     example_id_to_index = {k: i for i, k in enumerate(examples["id"])}
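The map built in the context line above is a plain dict comprehension; a toy illustration with hypothetical ids shows what it produces:

# Toy version of the `example_id_to_index` pattern from the snippet above.
examples = {"id": ["ex0", "ex1", "ex2"]}
example_id_to_index = {k: i for i, k in enumerate(examples["id"])}
assert example_id_to_index == {"ex0": 0, "ex1": 1, "ex2": 2}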
@@ -288,7 +288,7 @@ def postprocess_qa_predictions_with_beam_search(

     assert len(predictions[0]) == len(
         features
-    ), f"Got {len(predictions[0])} predicitions and {len(features)} features."
+    ), f"Got {len(predictions[0])} predictions and {len(features)} features."

     # Build a map example to its corresponding features.
     example_id_to_index = {k: i for i, k in enumerate(examples["id"])}
@@ -262,7 +262,7 @@ def __init__(self, label_num=81, backbone='resnet34', model_path="./resnet34-333
         self.size=(1200,1200)
         dboxes = dboxes_R34_coco(list(self.size),[3,3,2,2,2,2])
         self.encoder = Encoder(dboxes)
-        # intitalize all weights
+        # initialize all weights
         self._init_weights()
         self.device = 1
     def _build_additional_features(self, input_channels):
@@ -41,7 +41,7 @@ def __init__(self, label_num, backbone='resnet34', model_path="./resnet34-333f7e

         self.loc = nn.ModuleList(self.loc)
         self.conf = nn.ModuleList(self.conf)
-        # intitalize all weights
+        # initialize all weights
         self._init_weights()

     def _build_additional_features(self, input_size, input_channels):
@@ -65,7 +65,7 @@ def __init__(self, label_num, backbone='resnet34', model_path="./resnet34-333f7e
         self.loc = nn.ModuleList(self.loc)
         self.conf = nn.ModuleList(self.conf)

-        # intitalize all weights
+        # initialize all weights
         self._init_weights()

     def _build_additional_features(self, input_channels):
@@ -40,7 +40,7 @@ def __init__(self, label_num, backbone='resnet34', model_path=None):

         self.loc = nn.ModuleList(self.loc)
         self.conf = nn.ModuleList(self.conf)
-        # intitalize all weights
+        # initialize all weights
         self._init_weights()

     def _build_additional_features(self, input_size, input_channels):
@@ -1160,7 +1160,7 @@ def loadDataset(
     data_ready = False

     # pre-process data if needed
-    # WARNNING: when memory mapping is used we get a collection of files
+    # WARNING: when memory mapping is used we get a collection of files
     if data_ready:
         print("Reading pre-processed data=%s" % (str(pro_data)))
         file = str(pro_data)
@@ -97,7 +97,7 @@ def __init__(
         data_ready = False

         # pre-process data if needed
-        # WARNNING: when memory mapping is used we get a collection of files
+        # WARNING: when memory mapping is used we get a collection of files
         if data_ready:
            print("Reading pre-processed data=%s" % (str(pro_data)))
            file = str(pro_data)
@@ -1241,7 +1241,7 @@ def loadDataset(
     data_ready = False

     # pre-process data if needed
-    # WARNNING: when memory mapping is used we get a collection of files
+    # WARNING: when memory mapping is used we get a collection of files
     if data_ready:
         print("Reading pre-processed data=%s" % (str(pro_data)))
         file = str(pro_data)
@@ -111,7 +111,7 @@ def __init__(
         data_ready = False

         # pre-process data if needed
-        # WARNNING: when memory mapping is used we get a collection of files
+        # WARNING: when memory mapping is used we get a collection of files
         if data_ready:
            print("Reading pre-processed data=%s" % (str(pro_data)))
            file = str(pro_data)
2 changes: 1 addition & 1 deletion neural_compressor/adaptor/adaptor.py
@@ -49,7 +49,7 @@ def __init__(self, framework_specific_info):

     @abstractmethod
     def quantize(self, tune_cfg, model, dataloader, q_func=None):
-        """The function is used to do calibration and quanitization in post-training quantization.
+        """The function is used to do calibration and quantization in post-training quantization.

         Args:
             tune_cfg(dict): The chosen tuning configuration.
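For readers unfamiliar with the interface: `quantize` is the abstract entry point that each framework backend implements, as the MXNet and ONNX Runtime hunks below show. A minimal sketch of the contract, using a simplified stand-in for the real `Adaptor` base class (the no-op backend is illustrative only, not the library's behavior):

from abc import ABC, abstractmethod


class Adaptor(ABC):  # simplified stand-in for neural_compressor.adaptor.adaptor.Adaptor
    @abstractmethod
    def quantize(self, tune_cfg, model, dataloader, q_func=None):
        """Do calibration and quantization in post-training quantization."""


class NoOpAdaptor(Adaptor):
    """Illustrative only: runs a mock calibration pass and returns the model as-is."""

    def quantize(self, tune_cfg, model, dataloader, q_func=None):
        for _batch in dataloader:  # a real backend would observe activation ranges here
            pass
        return model  # a real backend would return a quantized copy built from tune_cfg


# Exercising the contract end to end: any iterable works as the dataloader.
quantized = NoOpAdaptor().quantize(tune_cfg={}, model=object(), dataloader=[])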
2 changes: 1 addition & 1 deletion neural_compressor/adaptor/mxnet.py
@@ -59,7 +59,7 @@ def __init__(self, framework_specific_info):

     @dump_elapsed_time("Pass quantize model")
     def quantize(self, tune_cfg, nc_model, dataloader, q_func=None):
-        """The function is used to do MXNet calibration and quanitization in post-training
+        """The function is used to do MXNet calibration and quantization in post-training
         quantization.

         Args:
4 changes: 2 additions & 2 deletions neural_compressor/adaptor/onnxrt.py
@@ -252,7 +252,7 @@ def _need_smooth_quant(self, tune_cfg) -> bool:

     @dump_elapsed_time("Pass quantize model")
     def quantize(self, tune_cfg, model, data_loader, q_func=None):
-        """The function is used to do calibration and quanitization in post-training
+        """The function is used to do calibration and quantization in post-training
         quantization.

         Args:
@@ -1853,7 +1853,7 @@ def __init__(self, framework_specific_info):

     @dump_elapsed_time("Pass quantize model")
     def quantize(self, tune_cfg, model, data_loader, q_func=None):
-        """The function is used to do calibration and quanitization in post-training
+        """The function is used to do calibration and quantization in post-training
         quantization.

         Args:
2 changes: 1 addition & 1 deletion neural_compressor/adaptor/tf_utils/graph_util.py
@@ -212,7 +212,7 @@ def query_fusion_pattern_nodes(self, patterns=None):
         return self._search_patterns(patterns)

     def _search_patterns(self, input_pattern):
-        """Search user specified patterns on internal grpah structure.
+        """Search user specified patterns on internal graph structure.

         Args:
             input_pattern (list): The element of the pattern list could be string/list/tuple.
3 changes: 0 additions & 3 deletions neural_compressor/adaptor/torch_utils/hawq_metric.py
@@ -23,14 +23,11 @@
 import logging

 import numpy as np
-import torch.nn
 import torch.nn as nn
 from torch.quantization.quantize_fx import fuse_fx

 logger = logging.getLogger(__name__)
 from typing import Any, Callable, Dict, List, Optional, Set, Union

-import torch
 import tqdm

2 changes: 1 addition & 1 deletion neural_compressor/data/datasets/dummy_dataset_v2.py
@@ -236,7 +236,7 @@ def __init__(
             self.label_shape = len(self.dense_shape) * self.label_shape
         assert len(self.label_shape) == len(
             self.dense_shape
-        ), "length of dense_shape should be euqal to length of label_shape"
+        ), "length of dense_shape should be equal to length of label_shape"
         self.label_dim = len(self.label_shape)

         self.input_dim = 1 if isinstance(dense_shape, tuple) else len(dense_shape)
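A toy illustration of the constraint the corrected assertion enforces, with hypothetical shapes: when `dense_shape` lists several input shapes, `label_shape` must supply one label shape per input.

# Hypothetical shapes; mirrors the assertion in dummy_dataset_v2.py.
dense_shape = [(4, 256), (4, 256)]  # two dense inputs
label_shape = [(1,), (1,)]          # one label shape per input
assert len(label_shape) == len(dense_shape), "length of dense_shape should be equal to length of label_shape"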
@@ -212,7 +212,7 @@ def query_fusion_pattern_nodes(self, patterns=None):
         return self._search_patterns(patterns)

     def _search_patterns(self, input_pattern):
-        """Search user specified patterns on internal grpah structure.
+        """Search user specified patterns on internal graph structure.

         Args:
             input_pattern (list): The element of the pattern list could be string/list/tuple.
