DivNAS#
Activations Analyser#
- archai.supergraph.algos.divnas.analyse_activations.create_submod_f(covariance: array) Callable [source]#
- archai.supergraph.algos.divnas.analyse_activations.rbf(x: array, y: array, sigma=0.1) array [source]#
Computes the RBF kernel between two input vectors.
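A minimal usage sketch, assuming NumPy inputs and the usual Gaussian form exp(-‖x − y‖² / (2σ²)); the exact normalization used by this implementation is not stated on this page:

```python
import numpy as np

from archai.supergraph.algos.divnas.analyse_activations import rbf

x = np.random.rand(16)
y = np.random.rand(16)

print(rbf(x, x, sigma=0.1))  # close to 1.0 for identical inputs under the usual convention (assumption)
print(rbf(x, y, sigma=0.1))  # decays toward 0 as the inputs move apart
```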
- archai.supergraph.algos.divnas.analyse_activations.compute_brute_force_sol(cov_kernel: array, budget: int) Tuple[Tuple[Any], float] [source]#
- archai.supergraph.algos.divnas.analyse_activations.compute_correlation(covariance: array) array [source]#
- archai.supergraph.algos.divnas.analyse_activations.compute_covariance_offline(feature_list: List[array]) array [source]#
Compute the covariance matrix for high-dimensional features. Each feature has shape (num_samples, feature_dim).
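A hedged usage sketch; treating the result as a square matrix with one row/column per entry in feature_list is an inference from the surrounding API, not stated above:

```python
import numpy as np

from archai.supergraph.algos.divnas.analyse_activations import compute_covariance_offline

# one (num_samples, feature_dim) activation matrix per candidate op/edge
feature_list = [np.random.rand(256, 64) for _ in range(4)]

cov = compute_covariance_offline(feature_list)
print(cov.shape)  # presumably (4, 4): one entry per pair of features (inference)
```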
- archai.supergraph.algos.divnas.analyse_activations.compute_rbf_kernel_covariance(feature_list: List[array], sigma=0.1) array [source]#
Compute the RBF kernel covariance for high-dimensional features.
feature_list: list of features, each of shape (num_samples, feature_dim)
sigma: bandwidth of the RBF kernel
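Usage follows the same pattern, with sigma acting as the kernel bandwidth (a sketch, assuming the same per-feature layout as above):

```python
import numpy as np

from archai.supergraph.algos.divnas.analyse_activations import compute_rbf_kernel_covariance

feature_list = [np.random.rand(256, 64) for _ in range(4)]

kernel_cov = compute_rbf_kernel_covariance(feature_list, sigma=0.1)
print(kernel_cov.shape)  # presumably (4, 4): one kernel value per pair of features (inference)
```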
- archai.supergraph.algos.divnas.analyse_activations.compute_euclidean_dist_quantiles(feature_list: List[array], subsamplefactor=1) List[Tuple[float, float]] [source]#
Compute quantile distances between feature pairs.
feature_list: list of features, each of shape (num_samples, feature_dim)
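These quantiles are handy for picking a sensible sigma (e.g. the median pairwise distance heuristic); interpreting the returned tuples as (quantile, distance) pairs is an assumption based on the return type:

```python
import numpy as np

from archai.supergraph.algos.divnas.analyse_activations import compute_euclidean_dist_quantiles

feature_list = [np.random.rand(256, 64) for _ in range(4)]

# subsamplefactor thins the pairwise distance computation to keep it cheap
quantiles = compute_euclidean_dist_quantiles(feature_list, subsamplefactor=4)
print(quantiles)  # e.g. [(0.1, d_10), (0.5, d_50), ...] (assumed interpretation)
```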
- archai.supergraph.algos.divnas.analyse_activations.greedy_op_selection(covariance: array, k: int) List[int] [source]#
- archai.supergraph.algos.divnas.analyse_activations.compute_marginal_gain(y: int, A: Set[int], S: Set[int], covariance: array) float [source]#
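A hedged sketch tying the selection utilities together: greedy_op_selection picks a subset of ops from a kernel covariance matrix, and compute_brute_force_sol gives the exhaustive baseline for small budgets. compute_marginal_gain is presumably the per-step gain the greedy loop maximizes; its exact set semantics are not documented here, so it is not exercised below.

```python
import numpy as np

from archai.supergraph.algos.divnas.analyse_activations import (
    compute_brute_force_sol,
    compute_rbf_kernel_covariance,
    greedy_op_selection,
)

feature_list = [np.random.rand(128, 32) for _ in range(5)]  # 5 candidate ops
cov_kernel = compute_rbf_kernel_covariance(feature_list, sigma=0.1)

greedy_ids = greedy_op_selection(cov_kernel, 3)                # indices of greedily selected ops
exact_ids, exact_val = compute_brute_force_sol(cov_kernel, 3)  # exhaustive baseline for budget 3
print(greedy_ids, exact_ids, exact_val)
```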
- archai.supergraph.algos.divnas.analyse_activations.collect_features(rootfolder: str, subsampling_factor: int = 1) Dict[str, List[array]] [source]#
Walks the rootfolder for h5py files and loads them into the format required for analysis.
Inputs:
rootfolder: full path to the folder containing h5 files that hold the activations
subsampling_factor: every nth minibatch is loaded to keep memory usage manageable
Outputs:
dictionary with edge name strings as keys and values that are lists of np.array of shape [num_samples, feature_dim]
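A usage sketch based on the inputs and outputs described above; the activation folder path is hypothetical:

```python
from archai.supergraph.algos.divnas.analyse_activations import collect_features

# '/tmp/divnas_activations' is a placeholder for a folder written by a DivNAS search run
features_by_edge = collect_features('/tmp/divnas_activations', subsampling_factor=4)

for edge_name, feats in features_by_edge.items():
    # each value is a list of np.array of shape [num_samples, feature_dim]
    print(edge_name, len(feats), feats[0].shape)
```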
Cell#
Experiment Runner#
- class archai.supergraph.algos.divnas.divnas_exp_runner.DivnasExperimentRunner(config_filename: str, base_name: str, clean_expdir=False)[source]#
- model_desc_builder() DivnasModelDescBuilder [source]#
- trainer_class() Type[ArchTrainer] | None [source]#
- finalizers() Finalizers [source]#
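A hedged sketch that only touches the constructor and the overridden factory methods listed above; the config path is hypothetical, and the constructor is assumed to read it, so it must point at a valid DivNAS config:

```python
from archai.supergraph.algos.divnas.divnas_exp_runner import DivnasExperimentRunner

# 'confs/algos/divnas.yaml' is a placeholder path to a DivNAS config file
runner = DivnasExperimentRunner('confs/algos/divnas.yaml', base_name='divnas')

builder = runner.model_desc_builder()  # DivnasModelDescBuilder
trainer_cls = runner.trainer_class()   # ArchTrainer subclass, or None
finalizers = runner.finalizers()       # Finalizers used by DivNAS
```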
Finalizers#
Model Description Builder#
Rank Finalizer#
Div-Based Operators#
- class archai.supergraph.algos.divnas.divop.DivOp(op_desc: OpDesc, arch_params: ArchParams | None, affine: bool)[source]#
The output of DivOp is the weighted output of all allowed primitives.
- PRIMITIVES = ['max_pool_3x3', 'avg_pool_3x3', 'skip_connect', 'sep_conv_3x3', 'sep_conv_5x5', 'dil_conv_3x3', 'dil_conv_5x5', 'none']#
- property collect_activations: bool#
- property activations: List[array] | None#
- property num_primitive_ops: int#
- forward(x)[source]#
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- ops() Iterator[Tuple[Op, float]] [source]#
Return constituent ops; if this op is primitive, just return self.
- finalize() Tuple[OpDesc, float | None] [source]#
DivNAS with the default finalizer option needs this override; otherwise the finalizer in the base class returns the whole DivOp.
- training: bool#
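An illustrative, standalone sketch of the mixed-op idea behind DivOp (not the real class): forward returns the softmax-weighted sum of the primitive outputs, and per-primitive activations can optionally be captured for the diversity analysis, mirroring the collect_activations / activations properties above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyMixedOp(nn.Module):
    """Toy stand-in for DivOp: weighted sum over primitives with activation capture."""

    def __init__(self, primitives):
        super().__init__()
        self._ops = nn.ModuleList(primitives)
        self._alphas = nn.Parameter(torch.zeros(len(primitives)))  # architecture weights
        self.collect_activations = False
        self.activations = []

    def forward(self, x):
        weights = F.softmax(self._alphas, dim=0)
        outs = [op(x) for op in self._ops]
        if self.collect_activations:
            # keep per-primitive activations around for the offline diversity analysis
            self.activations.append([o.detach().cpu().numpy() for o in outs])
        return sum(w * o for w, o in zip(weights, outs))


# three shape-preserving primitives stand in for the PRIMITIVES list above
ops = [nn.Identity(), nn.Conv2d(8, 8, 3, padding=1), nn.MaxPool2d(3, stride=1, padding=1)]
mixed = ToyMixedOp(ops)
y = mixed(torch.randn(2, 8, 16, 16))
print(y.shape)  # torch.Size([2, 8, 16, 16])
```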