This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Conversation


@Satrat Satrat commented Oct 19, 2023

  • Updates the SparseGPT modifier so that a quantization modifier can be defined under the SparseGPTModifier's quantize property within a recipe
  • Adds a qat_active function to ModifiableModel, used to determine whether quantization has already been applied
  • Boolean quantize values are still supported; warnings are logged for edge cases
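The qat_active check described above can be sketched as follows. This is a hypothetical illustration, not the actual ModifiableModel implementation: it assumes PyTorch QAT, where prepare_qat leaves FakeQuantize submodules (and qconfig attributes) on a model, and treats their presence as evidence that quantization has already been applied.

```python
import torch
from torch import nn
from torch.ao.quantization import FakeQuantize


def qat_active(model: nn.Module) -> bool:
    """Return True if QAT fake-quantization appears already applied.

    Illustrative sketch only: the real ModifiableModel.qat_active may use a
    different check. Here we scan for FakeQuantize submodules or qconfig
    attributes left behind by torch.ao.quantization.prepare_qat.
    """
    for module in model.modules():
        if isinstance(module, FakeQuantize):
            return True
        if getattr(module, "qconfig", None) is not None:
            return True
    return False
```

A guard like this lets the SparseGPT modifier skip re-applying a quantization modifier when the model is already in QAT mode.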

Example

    SparseGPTModifier:
      sparsity: 0.5
      block_size: 128
      sequential_update: False
      quantize:
        QuantizationModifier:
          ignore:
            - lm_head
            - Embedding
            - OPTLearnedPositionalEmbedding
            - QuantizableBatchMatMul
            - BMMLeftInput_QK
            - BMMRightInput_QK
            - BMMOutput_QK
            - BMMLeftInput_PV
            - BMMRightInput_PV
            - BMMOutput_PV
          post_oneshot_calibration: True
          scheme_overrides:
            ReLU:
              input_activations: null
              output_activations: null
            LayerNorm:
              input_activations: null
              output_activations: null
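Since the quantize field now accepts either the legacy boolean or a nested QuantizationModifier mapping, recipe handling needs a small dispatch on the value's type. The helper below is an illustrative sketch of that dispatch (resolve_quantize is a hypothetical name, not a sparseml API); it operates on the plain dict a YAML loader would produce for the recipe above.

```python
def resolve_quantize(quantize):
    """Return (enabled, config) for either form of the quantize field.

    Hypothetical helper for illustration: a bool means the legacy on/off
    behavior (no per-layer config); a dict means a nested
    QuantizationModifier configuration, as introduced by this PR.
    """
    if isinstance(quantize, bool):
        # legacy behavior: True enables a default quantization scheme
        return quantize, None
    if isinstance(quantize, dict) and "QuantizationModifier" in quantize:
        return True, quantize["QuantizationModifier"]
    raise ValueError(f"unsupported quantize value: {quantize!r}")


# nested form, mirroring the recipe above (abbreviated ignore list)
enabled, cfg = resolve_quantize(
    {
        "QuantizationModifier": {
            "ignore": ["lm_head", "Embedding"],
            "post_oneshot_calibration": True,
        }
    }
)

# legacy boolean form, still supported
legacy_enabled, legacy_cfg = resolve_quantize(False)
```

Keeping the boolean path alongside the nested form is what preserves backward compatibility for existing recipes.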

Testing

Added unit tests for the different quantization conditions

bfineran previously approved these changes Oct 20, 2023
@Satrat Satrat requested a review from bfineran October 26, 2023 20:46
@bfineran bfineran merged commit 916657c into main Oct 26, 2023
@bfineran bfineran deleted the sparsegpt_quant_child branch October 26, 2023 20:53
bfineran pushed a commit that referenced this pull request Nov 16, 2023
* basic implementation working

* qat active function and edge cases

* tests for obcq quant

* clean recipe

* docstrings for new quantization situation