ranzen.torch.transforms

Classes:

MixUpMode

An enum for the mix-up mode.

PairedIndices

Indices of anchor samples and the samples they are paired with.

RandomCutMix

Randomly apply CutMix to a batch of images.

RandomMixUp

Apply mixup to tensors within a batch with some probability.

class MixUpMode(value)

Bases: Enum

An enum for the mix-up mode.

geometric = 2

geometric mix-up

linear = 1

linear mix-up
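
For reference, given a pair of inputs x_a and x_b and an interpolation parameter lambda drawn from the lambda-sampler, the two modes correspond to the following mixing rules (standard definitions; lambda is assumed here to lie in [0, 1]):

    linear:    lambda * x_a + (1 - lambda) * x_b
    geometric: x_a ** lambda * x_b ** (1 - lambda)

The geometric rule is the (weighted) geometric mean and is therefore only valid for positive inputs.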

class PairedIndices(anchors, matches)

Bases: NamedTuple

Indices of anchor samples and the samples they are paired with.

Parameters:
  • anchors (Tensor)

  • matches (Tensor)

anchors: Tensor

Alias for field number 0

matches: Tensor

Alias for field number 1
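
The interpretation of the fields is inferred from their names: anchors holds the indices of the anchor samples and matches the indices of the samples they are paired with. A minimal sketch:

    import torch
    from ranzen.torch.transforms import PairedIndices

    # Pair sample 0 with sample 2, and sample 1 with sample 3.
    pairs = PairedIndices(anchors=torch.tensor([0, 1]), matches=torch.tensor([2, 3]))
    print(pairs.anchors)  # tensor([0, 1]) -- fields are accessible by name, as with any NamedTuple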

class RandomCutMix(alpha=1.0, *, p=1.0, num_classes=None, inplace=False, generator=None)

Bases: object

Randomly apply CutMix to a batch of images. PyTorch implementation of the CutMix image-augmentation strategy.

This implementation samples the bounding-box coordinates independently for each pair of samples being mixed and, unlike other implementations, does so in a fully vectorised manner.

Note

This implementation randomly mixes images within a batch.

Parameters:
  • alpha (float) – hyperparameter of the Beta distribution used for sampling the areas of the bounding boxes.

  • p (float) – The probability with which the transform will be applied to a given sample.

  • num_classes (int | None) – The total number of classes in the dataset; this must be specified if the targets to be mixed up are label-encoded. Passing label-encoded targets without specifying num_classes will result in a RuntimeError.

  • inplace (bool) – Whether the transform should be performed in-place.

  • generator (Generator | None) – Pseudo-random-number generator to use for sampling. Note that torch.distributions.Beta does not accept such a generator object, so the sampling procedure is only partially deterministic as a function of it.

Raises:

ValueError – if p is not in the range [0, 1], if num_classes < 1, or if alpha is not a positive real number.
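
A minimal usage sketch, assuming (as the class description implies) that the transform is applied by calling the instance on a batch of images, optionally together with targets:

    import torch
    from ranzen.torch.transforms import RandomCutMix

    images = torch.rand(16, 3, 32, 32)     # batch of 16 RGB images
    targets = torch.randint(0, 10, (16,))  # label-encoded targets

    # num_classes must be given because the targets are label-encoded.
    cutmix = RandomCutMix(alpha=1.0, p=0.5, num_classes=10)
    # Assumption: calling the instance mixes random pairs within the batch and
    # returns the transformed images along with correspondingly mixed targets.
    output = cutmix(images, targets=targets)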

class RandomMixUp(lambda_sampler, *, mode=MixUpMode.linear, p=1.0, num_classes=None, featurewise=False, inplace=False, generator=None)

Bases: Generic[LS]

Apply mixup to tensors within a batch with some probability.

PyTorch implementation of mixup. This implementation allows for transformation of the input in the absence of labels (relevant, for instance, to contrastive methods that use mixup to generate different views of samples to enable instance-discrimination) and additionally allows for different lambda-samplers, for different methods of mixing up samples (linear vs. geometric) based on lambda, and for selective pair-sampling. Furthermore, unlike the official implementation, samples are guaranteed not to be paired with themselves.

Note

This implementation randomly mixes images within a batch.

Parameters:
  • lambda_sampler (LS) – The distribution from which to sample lambda (the mixup interpolation parameter).

  • mode (MixUpMode | str) –

    Which mode to use to mix up samples: geometric or linear.

    Note

    The (weighted) geometric mean, enabled by mode=geometric, is only valid for positive inputs.

  • p (float) – The probability with which the transform will be applied to a given sample.

  • num_classes (int | None) – The total number of classes in the dataset; this must be specified if the targets to be mixed up are label-encoded. Passing label-encoded targets without specifying num_classes will result in a RuntimeError.

  • featurewise (bool) –

    Whether to sample lambda feature-wise instead of sample-wise.

    Note

    If the lambda_sampler is a BernoulliDistribution, then featurewise sampling will always be enabled.

  • inplace (bool) – Whether the transform should be performed in-place.

  • generator (torch.Generator | None) – Pseudo-random-number generator to use for sampling. Note that torch.distributions.Distribution does not accept such a generator object, so the sampling procedure is only partially deterministic as a function of it.

Raises:

ValueError – if p is not in the range [0, 1] or num_classes < 1.
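
As a sketch of direct construction: any torch.distributions sampler with a compatible support can serve as lambda_sampler; here a Beta distribution is used, and the instance is assumed to be callable directly on a batch (label-free usage, e.g. generating mixed views for contrastive instance-discrimination):

    import torch
    from torch.distributions import Beta
    from ranzen.torch.transforms import MixUpMode, RandomMixUp

    mixup = RandomMixUp(
        lambda_sampler=Beta(concentration1=0.2, concentration0=0.2),
        mode=MixUpMode.linear,
        p=0.8,
    )
    images = torch.rand(16, 3, 32, 32)
    # Assumption: with no targets passed, only the mixed inputs are returned.
    views = mixup(images)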

classmethod with_bernoulli_dist(prob_1=0.5, *, mode=MixUpMode.linear, p=1.0, num_classes=None, inplace=False, generator=None)

Instantiate a RandomMixUp with a Bernoulli-distribution sampler.

Parameters:
  • prob_1 (float) – The probability of sampling 1.

  • mode (MixUpMode | str) –

    Which mode to use to mix up samples: geometric or linear.

    Note

    The (weighted) geometric mean, enabled by mode=geometric, is only valid for positive inputs.

  • p (float) – The probability with which the transform will be applied to a given sample.

  • num_classes (int | None) – The total number of classes in the dataset; this must be specified if the targets to be mixed up are label-encoded. Passing label-encoded targets without specifying num_classes will result in a RuntimeError.

  • inplace (bool) – Whether the transform should be performed in-place.

  • generator (Generator | None) – Pseudo-random-number generator to use for sampling. Note that torch.distributions.Distribution does not accept such a generator object, so the sampling procedure is only partially deterministic as a function of it.

Returns:

A RandomMixUp instance with lambda_sampler set to a Bernoulli-distribution with probs=prob_1.

Return type:

RandomMixUp[Bernoulli]
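
A brief usage sketch. Per the class-level note on featurewise, a Bernoulli sampler always entails feature-wise sampling, which is presumably why this constructor takes no featurewise argument:

    from ranzen.torch.transforms import RandomMixUp

    # lambda is sampled as 0 or 1 per feature, so mixing amounts to randomly
    # swapping features between paired samples.
    mixup = RandomMixUp.with_bernoulli_dist(prob_1=0.5, num_classes=10)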

classmethod with_beta_dist(alpha=0.2, *, beta=None, mode=MixUpMode.linear, p=1.0, num_classes=None, inplace=False, featurewise=False, generator=None)

Instantiate a RandomMixUp with a Beta-distribution sampler.

Parameters:
  • alpha (float) – 1st concentration parameter of the distribution. Must be positive.

  • beta (float | None) – 2nd concentration parameter of the distribution. Must be positive. If None, then the parameter will be set to alpha.

  • mode (MixUpMode | str) –

    Which mode to use to mix up samples: geometric or linear.

    Note

    The (weighted) geometric mean, enabled by mode=geometric, is only valid for positive inputs.

  • p (float) – The probability with which the transform will be applied to a given sample.

  • num_classes (int | None) – The total number of classes in the dataset; this must be specified if the targets to be mixed up are label-encoded. Passing label-encoded targets without specifying num_classes will result in a RuntimeError.

  • inplace (bool) – Whether the transform should be performed in-place.

  • featurewise (bool) – Whether to sample lambda feature-wise instead of sample-wise.

  • generator (Generator | None) – Pseudo-random-number generator to use for sampling. Note that torch.distributions.Distribution does not accept such a generator object, so the sampling procedure is only partially deterministic as a function of it.

Returns:

A RandomMixUp instance with lambda_sampler set to a Beta-distribution with concentration1=alpha and concentration0=beta.

Return type:

RandomMixUp[Beta]
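
A brief usage sketch. Leaving beta=None yields a symmetric Beta(alpha, alpha) sampler:

    from ranzen.torch.transforms import RandomMixUp

    # Symmetric Beta(0.2, 0.2) sampler for the interpolation parameter.
    mixup = RandomMixUp.with_beta_dist(alpha=0.2, p=1.0, num_classes=10)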

classmethod with_uniform_dist(low=0.0, *, high=1.0, mode=MixUpMode.linear, p=1.0, num_classes=None, inplace=False, featurewise=False, generator=None)

Instantiate a RandomMixUp with a uniform-distribution sampler.

Parameters:
  • low (float) – Lower range (inclusive).

  • high (float) – Upper range (exclusive).

  • mode (MixUpMode | str) –

    Which mode to use to mix up samples: geometric or linear.

    Note

    The (weighted) geometric mean, enabled by mode=geometric, is only valid for positive inputs.

  • p (float) – The probability with which the transform will be applied to a given sample.

  • num_classes (int | None) – The total number of classes in the dataset; this must be specified, either here or at call-time, if the targets to be mixed up are label-encoded. Passing label-encoded targets without specifying num_classes will result in a RuntimeError.

  • inplace (bool) – Whether the transform should be performed in-place.

  • featurewise (bool) – Whether to sample lambda feature-wise instead of sample-wise.

  • generator (Generator | None) – Pseudo-random-number generator to use for sampling. Note that torch.distributions.Distribution does not accept such a generator object, so the sampling procedure is only partially deterministic as a function of it.

Returns:

A RandomMixUp instance with lambda_sampler set to a Uniform-distribution with low=low and high=high.

Return type:

RandomMixUp[Uniform]
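
A brief usage sketch. Since the mode parameter is typed MixUpMode | str, it can also be given as a string:

    from ranzen.torch.transforms import RandomMixUp

    # lambda drawn uniformly from [0, 1), sampled independently per feature.
    mixup = RandomMixUp.with_uniform_dist(low=0.0, high=1.0, featurewise=True, mode="linear")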