SqueezeNet
eqxvision.models.SqueezeNet
A simple port of torchvision.models.squeezenet.
__init__(self, version: str = '1_0', num_classes: int = 1000, dropout: float = 0.5, *, key: Optional[jax.random.PRNGKey] = None)

Arguments:
version: Specifies the version of the network. Defaults to 1_0.
num_classes: Number of classes in the classification task. Also controls the final output shape (num_classes,). Defaults to 1000.
dropout: The probability parameter for equinox.nn.Dropout.
key: A jax.random.PRNGKey used to provide randomness for parameter initialisation. (Keyword-only argument.)
__call__(self, x: Array, *, key: jax.random.PRNGKey) -> Array

Arguments:
x: The input. Should be a JAX array with 3 channels.
key: Required parameter. Utilised by a few layers such as Dropout or DropPath.
eqxvision.models.squeezenet1_0(torch_weights: str = None, **kwargs: Any) -> SqueezeNet

SqueezeNet model architecture from the SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size paper. The required minimum input size of the model is 21x21.

Arguments:
torch_weights: A Path or URL for the PyTorch weights. Defaults to None.
eqxvision.models.squeezenet1_1(torch_weights: str = None, **kwargs: Any) -> SqueezeNet

SqueezeNet 1.1 model from the official SqueezeNet repo (https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1). SqueezeNet 1.1 has 2.4x less computation and slightly fewer parameters than SqueezeNet 1.0, without sacrificing accuracy. The required minimum input size of the model is 17x17.

Arguments:
torch_weights: A Path or URL for the PyTorch weights. Defaults to None.
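A hedged sketch of loading pretrained torchvision weights through `torch_weights`. It assumes eqxvision exposes a URL table as `eqxvision.utils.CLASSIFICATION_URLS` and that PyTorch is installed to deserialise the checkpoint; if those assumptions do not hold for your version, pass a local path or URL string directly:

```python
from eqxvision.models import squeezenet1_1
from eqxvision.utils import CLASSIFICATION_URLS  # assumption: URL lookup table shipped by eqxvision

# Download torchvision's SqueezeNet 1.1 checkpoint and copy its parameters
# into the Equinox model. Requires torch for checkpoint deserialisation.
net = squeezenet1_1(torch_weights=CLASSIFICATION_URLS["squeezenet1_1"])
```

With `torch_weights=None` (the default), the model is returned with freshly initialised parameters instead.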