How to include normalization of features in Keras regression model?

Asked by 有刺的猬, 2021-01-20 01:27

I have data for a regression task. The independent features (X_train) are scaled with a standard scaler. I built a Keras sequential model, adding hidden layers. C

1 Answer

    Answered by 北海茫月, 2021-01-20 02:23

    The standard and efficient way, as I understand it, is to use TensorFlow Transform. This doesn't mean you need the entire TFX pipeline just to use TF Transform; TF Transform can be used standalone as well.

    TensorFlow Transform creates a Beam transformation graph and injects these transformations as constants into the TensorFlow graph. Because the transformations are represented as constants in the graph, they are consistent between training and serving. The advantages of that consistency are:

    1. It eliminates training-serving skew.
    2. It eliminates the need for preprocessing code in the serving system, which improves latency.

    Sample code for TF Transform is shown below.

    Importing all the dependencies:

    try:
      import tensorflow_transform as tft
      import apache_beam as beam
    except ImportError:
      print('Installing TensorFlow Transform.  This will take a minute, ignore the warnings')
      !pip install -q tensorflow_transform
      print('Installing Apache Beam.  This will take a minute, ignore the warnings')
      !pip install -q apache_beam
      import tensorflow_transform as tft
      import apache_beam as beam
    
    import tensorflow as tf
    import tensorflow_transform.beam as tft_beam
    from tensorflow_transform.tf_metadata import dataset_metadata
    from tensorflow_transform.tf_metadata import dataset_schema
    

    Below is the preprocessing function, where we define all the transformations:

    def preprocessing_fn(inputs):
      """Preprocess input columns into transformed columns."""
      # NUMERIC_FEATURE_KEYS and OPTIONAL_NUMERIC_FEATURE_KEYS are lists of
      # column names defined for your dataset.
      # Since we are modifying some features and leaving others unchanged, we
      # start by setting `outputs` to a copy of `inputs`.
      outputs = inputs.copy()
    
      # Scale numeric columns to have range [0, 1].
      for key in NUMERIC_FEATURE_KEYS:
        outputs[key] = tft.scale_to_0_1(outputs[key])
    
      for key in OPTIONAL_NUMERIC_FEATURE_KEYS:
        # This is a SparseTensor because it is optional. Here we fill in a
        # default value when it is missing. (The TF 1.x `tf.sparse_to_dense`
        # has been replaced by `tf.sparse.to_dense` in TF 2.x.)
        dense = tf.sparse.to_dense(
            tf.sparse.SparseTensor(outputs[key].indices,
                                   outputs[key].values,
                                   [outputs[key].dense_shape[0], 1]),
            default_value=0.)
        # Reshape from a batch of size-1 vectors to a batch of scalars.
        dense = tf.squeeze(dense, axis=1)
        outputs[key] = tft.scale_to_0_1(dense)
    
      return outputs
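
    To make concrete what the function above computes, here is a plain-Python sketch of the same logic (no TF Transform involved): it scales each numeric column to [0, 1] over the batch, and fills missing optional values with a default of 0 before scaling. The column names ("age", "capital_gain") and the batch values are made up for illustration.

```python
def scale_to_0_1(values):
    """Plain-Python analogue of tft.scale_to_0_1: (x - min) / (max - min)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical batch: "age" is always present, "capital_gain" is optional.
inputs = {
    "age": [20.0, 30.0, 40.0],
    "capital_gain": [100.0, None, 300.0],  # None marks a missing value
}

outputs = dict(inputs)

# Scale the required numeric column to [0, 1].
outputs["age"] = scale_to_0_1(outputs["age"])  # [0.0, 0.5, 1.0]

# Fill missing optional values with a default of 0, then scale.
dense = [v if v is not None else 0.0 for v in outputs["capital_gain"]]
outputs["capital_gain"] = scale_to_0_1(dense)  # roughly [0.33, 0.0, 1.0]
```

    In the real `preprocessing_fn`, min and max are computed by TF Transform's analyzers over the full dataset rather than a single batch.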
    

    In addition to tft.scale_to_0_1, you can also use other APIs for normalization, such as tft.scale_by_min_max and tft.scale_to_z_score.
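
    For reference, the math behind these two scaling APIs can be sketched in plain Python. This is only an illustration of the formulas, not TF Transform's implementation; in particular, the standard-deviation convention (population vs. sample) shown here may not match TF Transform's exact behavior.

```python
import statistics

def scale_by_min_max(values, output_min=0.0, output_max=1.0):
    """Map values linearly so min(values) -> output_min and max(values) -> output_max."""
    lo, hi = min(values), max(values)
    return [output_min + (v - lo) * (output_max - output_min) / (hi - lo)
            for v in values]

def scale_to_z_score(values):
    """Standardize values to zero mean and unit variance (population std here)."""
    mean = statistics.fmean(values)
    std = statistics.pstdev(values)
    return [(v - mean) / std for v in values]

x = [1.0, 2.0, 3.0, 4.0]
scaled = scale_by_min_max(x, -1.0, 1.0)  # spans [-1, 1]
z = scale_to_z_score(x)                  # zero mean, unit variance
```

    As with scale_to_0_1, TF Transform computes the min, max, mean, and standard deviation as full-pass analyzers over the whole dataset, so the serving graph reuses the statistics seen at training time.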

    You can refer to the links below for detailed information and for the TF Transform tutorial:

    https://www.tensorflow.org/tfx/transform/get_started

    https://www.tensorflow.org/tfx/tutorials/transform/census
