Commit 707eacf

docs: specify directives (#2814)
(cherry picked from commit d52cb48)
1 parent fc610e1 commit 707eacf


47 files changed: +129, -108 lines

.github/CONTRIBUTING.md

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@ def my_func(param_a: int, param_b: Optional[float] = None) -> str:
 >>> my_func(1, 2)
 3
 
-.. note:: If you want to add something.
+.. hint:: If you want to add something.
 """
 p = param_b if param_b else 0
 return str(param_a + p)

docs/source/pages/implement.rst

Lines changed: 5 additions & 5 deletions
@@ -257,7 +257,7 @@ and tests gets formatted in the following way:
 3. ``new_metric(...)``: essentially wraps the ``_update`` and ``_compute`` private functions into one public function that
 makes up the functional interface for the metric.
 
-.. note::
+.. hint::
 The `functional mean squared error <https://github.com/Lightning-AI/torchmetrics/blob/master/src/torchmetrics/functional/regression/mse.py>`_
 metric is a great example of how to divide the logic.
 
@@ -270,9 +270,9 @@ and tests gets formatted in the following way:
 ``_new_metric_compute(...)`` function in its ``compute``. No logic should really be implemented in the module interface.
 We do this to not have duplicate code to maintain.
 
-.. note::
-The module `MeanSquaredError <https://github.com/Lightning-AI/torchmetrics/blob/master/src/torchmetrics/regression/mse.py>`_
-metric that corresponds to the above functional example showcases these steps.
+.. note::
+The module `MeanSquaredError <https://github.com/Lightning-AI/torchmetrics/blob/master/src/torchmetrics/regression/mse.py>`_
+metric that corresponds to the above functional example showcases these steps.
 
 4. Remember to add binding to the different relevant ``__init__`` files.
 
@@ -291,7 +291,7 @@ and tests gets formatted in the following way:
 so that different combinations of inputs and parameters get tested.
 5. (optional) If your metric raises any exception, please add tests that showcase this.
 
-.. note::
+.. hint::
 The `test file for MSE <https://github.com/Lightning-AI/torchmetrics/blob/master/tests/unittests/regression/test_mean_error.py>`_
 metric shows how to implement such tests.
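
For orientation, the layout that the first two hunks above describe can be sketched roughly as follows. This is not code from the repository; the ``_new_metric_*`` names simply mirror the placeholders used in the guide, and mean squared error is used as a stand-in computation.

from typing import Tuple

import torch
from torch import Tensor


def _new_metric_update(preds: Tensor, target: Tensor) -> Tuple[Tensor, int]:
    # Accumulate sufficient statistics for one batch (here: squared-error sum and count).
    sum_squared_error = torch.sum((preds - target) ** 2)
    return sum_squared_error, target.numel()


def _new_metric_compute(sum_squared_error: Tensor, num_obs: int) -> Tensor:
    # Turn the accumulated statistics into the final metric value.
    return sum_squared_error / num_obs


def new_metric(preds: Tensor, target: Tensor) -> Tensor:
    # Public functional interface: wraps the private ``_update`` and ``_compute`` helpers.
    sum_squared_error, num_obs = _new_metric_update(preds, target)
    return _new_metric_compute(sum_squared_error, num_obs)

A module version would then call ``_new_metric_update`` inside its ``update`` and ``_new_metric_compute`` inside its ``compute``, keeping the logic in one place as the second hunk recommends.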

docs/source/pages/lightning.rst

Lines changed: 4 additions & 4 deletions
@@ -13,7 +13,7 @@ TorchMetrics in PyTorch Lightning
 TorchMetrics was originally created as part of `PyTorch Lightning <https://github.com/Lightning-AI/pytorch-lightning>`_, a powerful deep learning research
 framework designed for scaling models without boilerplate.
 
-.. note::
+.. caution::
 
 TorchMetrics always offers compatibility with the last 2 major PyTorch Lightning versions, but we recommend always
 keeping both frameworks up-to-date for the best experience.
@@ -69,9 +69,9 @@ LightningModule `self.log <https://lightning.ai/docs/pytorch/stable/extensions/l
 method, Lightning will log the metric based on ``on_step`` and ``on_epoch`` flags present in ``self.log(...)``. If
 ``on_epoch`` is True, the logger automatically logs the end of epoch metric value by calling ``.compute()``.
 
-.. note::
+.. caution::
 
-``sync_dist``, ``sync_dist_group`` and ``reduce_fx`` flags from ``self.log(...)`` don't affect the metric logging
+The ``sync_dist``, ``sync_dist_group`` and ``reduce_fx`` flags from ``self.log(...)`` don't affect the metric logging
 in any manner. The metric class contains its own distributed synchronization logic.
 
 This, however, is only true for metrics that inherit the base class ``Metric``,
@@ -136,7 +136,7 @@ Note that logging metrics this way will require you to manually reset the metric
 In general, we recommend logging the metric object to make sure that metrics are correctly computed and reset.
 Additionally, we highly recommend that the two ways of logging are not mixed as it can lead to wrong results.
 
-.. note::
+.. hint::
 
 When using any Modular metric, calling ``self.metric(...)`` or ``self.metric.forward(...)`` serves the dual purpose
 of calling ``self.metric.update()`` on its input and simultaneously returning the metric value over the provided
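
To make the logging behaviour described in these hunks concrete, here is a minimal sketch of a LightningModule that logs a metric object. The import path and the model wiring are assumptions for illustration, not part of the diff.

import torch
import torchmetrics
from lightning.pytorch import LightningModule  # assumed package layout (Lightning >= 2.0)


class LitClassifier(LightningModule):
    def __init__(self, backbone: torch.nn.Module):
        super().__init__()
        self.backbone = backbone
        # Metric object owned by the module; it keeps its own state and sync logic.
        self.val_acc = torchmetrics.classification.BinaryAccuracy()

    def validation_step(self, batch, batch_idx):
        x, y = batch
        preds = torch.sigmoid(self.backbone(x)).squeeze(-1)
        self.val_acc.update(preds, y)
        # Logging the metric *object*: with on_epoch=True Lightning calls .compute()
        # at epoch end; sync_dist/sync_dist_group/reduce_fx would not change this,
        # since the metric handles distributed synchronization itself.
        self.log("val_acc", self.val_acc, on_step=False, on_epoch=True)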

docs/source/pages/overview.rst

Lines changed: 4 additions & 4 deletions
@@ -61,13 +61,13 @@ This metrics API is independent of PyTorch Lightning. Metrics can directly be us
 It is highly recommended to re-initialize the metric per mode as
 shown in the examples above.
 
-.. note::
+.. caution::
 
 Metric states are **not** added to the model's ``state_dict`` by default.
 To change this, after initializing the metric, the method ``.persistent(mode)`` can
 be used to enable (``mode=True``) or disable (``mode=False``) this behaviour.
 
-.. note::
+.. important::
 
 Due to specialized logic around metric states, we in general do **not** recommend
 that metrics are initialized inside other metrics (nested metrics), as this can lead
@@ -306,7 +306,7 @@ This pattern is implemented for the following operators (with ``a`` being metric
 * Positive Value (``pos(a)``)
 * Indexing (``a[0]``)
 
-.. note::
+.. caution::
 
 Some of these operations are only fully supported from PyTorch v1.4 and onwards, explicitly we found:
 ``add``, ``mul``, ``rmatmul``, ``rsub``, ``rmod``
@@ -381,7 +381,7 @@ inside your LightningModule. In most cases we just have to replace ``self.log``
 # remember to reset metrics at the end of the epoch
 self.valid_metrics.reset()
 
-.. note::
+.. important::
 
 `MetricCollection` by default assumes that all the metrics in the collection
 have the same call signature. If this is not the case, input that should be
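
The two overview.rst passages above, operator composition and ``MetricCollection``, can be illustrated with a short sketch; the metric classes and the ``num_classes`` value are arbitrary examples rather than anything taken from the diff.

import torch
from torchmetrics import MetricCollection
from torchmetrics.classification import MulticlassAccuracy, MulticlassPrecision

# Operator composition: the result is itself a metric that updates both operands.
acc = MulticlassAccuracy(num_classes=3)
prec = MulticlassPrecision(num_classes=3)
balanced = (acc + prec) / 2

# MetricCollection assumes every member accepts the same (preds, target) signature.
valid_metrics = MetricCollection(
    {"acc": MulticlassAccuracy(num_classes=3), "prec": MulticlassPrecision(num_classes=3)},
    prefix="val_",
)

preds = torch.randint(0, 3, (8,))
target = torch.randint(0, 3, (8,))
print(balanced(preds, target))       # single composed value
print(valid_metrics(preds, target))  # dict of per-metric values
valid_metrics.reset()                # remember to reset at the end of the epoch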

src/torchmetrics/audio/dnsmos.py

Lines changed: 4 additions & 2 deletions
@@ -54,11 +54,13 @@ class DeepNoiseSuppressionMeanOpinionScore(Metric):
 - ``dnsmos`` (:class:`~torch.Tensor`): float tensor of DNSMOS values reduced across the batch
 with shape ``(...,4)`` indicating [p808_mos, mos_sig, mos_bak, mos_ovr] in the last dim.
 
-.. note:: using this metric requires you to have ``librosa``, ``onnxruntime`` and ``requests`` installed.
+.. hint::
+Using this metric requires you to have ``librosa``, ``onnxruntime`` and ``requests`` installed.
 Install as ``pip install torchmetrics['audio']`` or alternatively `pip install librosa onnxruntime-gpu requests`
 (if you do not have a GPU-enabled machine, install `onnxruntime` instead of `onnxruntime-gpu`)
 
-.. note:: the ``forward`` and ``compute`` methods in this class return a reduced DNSMOS value
+.. caution::
+The ``forward`` and ``compute`` methods in this class return a reduced DNSMOS value
 for a batch. To obtain the DNSMOS value for each sample, you may use the functional counterpart in
 :func:`~torchmetrics.functional.audio.dnsmos.deep_noise_suppression_mean_opinion_score`.

src/torchmetrics/audio/pesq.py

Lines changed: 4 additions & 2 deletions
@@ -45,12 +45,14 @@ class PerceptualEvaluationSpeechQuality(Metric):
 
 - ``pesq`` (:class:`~torch.Tensor`): float tensor of PESQ value reduced across the batch
 
-.. note:: using this metrics requires you to have ``pesq`` install. Either install as ``pip install
+.. hint::
+Using this metric requires you to have ``pesq`` installed. Either install as ``pip install
 torchmetrics[audio]`` or ``pip install pesq``. ``pesq`` will compile with your currently
 installed version of numpy, meaning that if you upgrade numpy at some point in the future you will
 most likely have to reinstall ``pesq``.
 
-.. note:: the ``forward`` and ``compute`` methods in this class return a single (reduced) PESQ value
+.. caution::
+The ``forward`` and ``compute`` methods in this class return a single (reduced) PESQ value
 for a batch. To obtain a PESQ value for each sample, you may use the functional counterpart in
 :func:`~torchmetrics.functional.audio.pesq.perceptual_evaluation_speech_quality`.
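
The distinction the pesq.py docstring draws between the reduced module output and the per-sample functional output might look like the following sketch. It assumes the optional ``pesq`` dependency is installed; the sample rate, mode and random waveforms are placeholders for real speech.

import torch
from torchmetrics.audio import PerceptualEvaluationSpeechQuality
from torchmetrics.functional.audio import perceptual_evaluation_speech_quality

# Random tensors stand in for real 16 kHz speech; requires `pesq` to be installed.
preds = torch.randn(2, 16000)
target = torch.randn(2, 16000)

metric = PerceptualEvaluationSpeechQuality(fs=16000, mode="wb")
batch_value = metric(preds, target)  # one reduced PESQ value for the whole batch

per_sample = perceptual_evaluation_speech_quality(preds, target, fs=16000, mode="wb")
# per_sample holds one PESQ value per waveform in the batch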

src/torchmetrics/audio/srmr.py

Lines changed: 3 additions & 2 deletions
@@ -49,11 +49,12 @@ class SpeechReverberationModulationEnergyRatio(Metric):
 
 - ``srmr`` (:class:`~torch.Tensor`): float scalar tensor
 
-.. note:: using this metrics requires you to have ``gammatone`` and ``torchaudio`` installed.
+.. hint::
+Using this metric requires you to have ``gammatone`` and ``torchaudio`` installed.
 Either install as ``pip install torchmetrics[audio]`` or ``pip install torchaudio``
 and ``pip install git+https://github.com/detly/gammatone``.
 
-.. note::
+.. attention::
 This implementation is experimental, and might not be consistent with the MATLAB
 implementation `SRMRToolbox`_, especially the fast implementation.
 The slow versions, a) fast=False, norm=False, max_cf=128, b) fast=False, norm=True, max_cf=30, have

src/torchmetrics/audio/stoi.py

Lines changed: 2 additions & 1 deletion
@@ -50,7 +50,8 @@ class ShortTimeObjectiveIntelligibility(Metric):
 
 - ``stoi`` (:class:`~torch.Tensor`): float scalar tensor
 
-.. note:: using this metrics requires you to have ``pystoi`` install. Either install as ``pip install
+.. hint::
+Using this metric requires you to have ``pystoi`` installed. Either install as ``pip install
 torchmetrics[audio]`` or ``pip install pystoi``.
 
 Args:

src/torchmetrics/classification/calibration_error.py

Lines changed: 1 addition & 1 deletion
@@ -214,7 +214,7 @@ class MulticlassCalibrationError(Metric):
 - ``target`` (:class:`~torch.Tensor`): An int tensor of shape ``(N, ...)`` containing ground truth labels, and
 therefore only contain values in the [0, n_classes-1] range (except if `ignore_index` is specified).
 
-.. note::
+.. tip::
 Additional dimension ``...`` will be flattened into the batch dimension.
 
 As output to ``forward`` and ``compute`` the metric returns the following output:

src/torchmetrics/classification/cohen_kappa.py

Lines changed: 2 additions & 2 deletions
@@ -50,7 +50,7 @@ class labels.
 Additionally, we convert to int tensor with thresholding using the value in ``threshold``.
 - ``target`` (:class:`~torch.Tensor`): An int tensor of shape ``(N, ...)``.
 
-.. note::
+.. tip::
 Additional dimension ``...`` will be flattened into the batch dimension.
 
 As output to ``forward`` and ``compute`` the metric returns the following output:
@@ -175,7 +175,7 @@ class labels.
 convert probabilities/logits into an int tensor.
 - ``target`` (:class:`~torch.Tensor`): An int tensor of shape ``(N, ...)``.
 
-.. note::
+.. tip::
 Additional dimension ``...`` will be flattened into the batch dimension.
 
 As output to ``forward`` and ``compute`` the metric returns the following output:
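
As a small illustration of the "additional dimensions are flattened into the batch dimension" tip in the two hunks above (shapes chosen arbitrarily, not taken from the diff):

import torch
from torchmetrics.classification import BinaryCohenKappa

# A (2, 5) input behaves like 10 independent samples after flattening.
preds = torch.rand(2, 5)               # float probabilities, thresholded at 0.5 internally
target = torch.randint(0, 2, (2, 5))   # integer labels

metric = BinaryCohenKappa()
print(metric(preds, target))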
