Inferring Predictions on Test Data Using a Trained GRU Model

This code infers predictions on the test data using a trained GRU model. It first reads the sample submission file and rearranges its rows to match the order of the processed test files. It then iterates over each test file, builds the GRU model, loads the test data, and averages the predictions of the five fold models. The averaged predictions are written into the submission DataFrame, and memory is freed after each file for efficient resource utilization.

This code snippet is part of a larger script that contains other conditional blocks for different tasks; it runs only when the INFER_TEST variable is set to True. It also relies on names defined elsewhere in that script, such as cudf, cupy, np, K, gc, build_model, PATH_TO_DATA, PATH_TO_MODEL, and NUM_FILES.
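The row-rearranging step described above can be sketched on toy data. This is a minimal illustration only, using pandas in place of cuDF and made-up customer IDs and hash values; it shows how `set_index(...).loc[...]` reorders a DataFrame to match an external index:

```python
import pandas as pd

# Toy submission frame (values are illustrative, not from the competition data)
sub = pd.DataFrame({
    'customer_ID': ['aaa', 'bbb', 'ccc'],
    'prediction': [0.0, 0.0, 0.0],
})
sub['hash'] = [101, 102, 103]        # stand-in for hex_to_int of the ID tail

# Order in which the processed test files store customers
test_hash_index = [103, 101, 102]

# Reorder sub's rows to match the processed test files
sub = sub.set_index('hash').loc[test_hash_index].reset_index(drop=True)
print(sub['customer_ID'].tolist())   # ['ccc', 'aaa', 'bbb']
```

After `reset_index(drop=True)`, the hash index is discarded and the rows sit in the same order as the test files, so predictions can later be written back with plain positional slices.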

if INFER_TEST:
    # INFER TEST DATA
    start = 0; end = 0
    sub = cudf.read_csv('../input/amex-default-prediction/sample_submission.csv')
    
    # REARRANGE SUB ROWS TO MATCH PROCESSED TEST FILES
    sub['hash'] = sub['customer_ID'].str[-16:].str.hex_to_int().astype('int64')
    test_hash_index = cupy.load(f'{PATH_TO_DATA}test_hashes_data.npy')
    sub = sub.set_index('hash').loc[test_hash_index].reset_index(drop=True)
    
    for k in range(NUM_FILES):
        # BUILD MODEL
        K.clear_session()
        model = build_model()
        
        # LOAD TEST DATA
        print(f'Inferring Test_File_{k+1}')
        X_test = np.load(f'{PATH_TO_DATA}test_data_{k+1}.npy')
        end = start + X_test.shape[0]

        # INFER 5 FOLD MODELS
        model.load_weights(f'{PATH_TO_MODEL}gru_fold_1.h5')
        p = model.predict(X_test, batch_size=512, verbose=0).flatten() 
        for j in range(1,5):
            model.load_weights(f'{PATH_TO_MODEL}gru_fold_{j+1}.h5')
            p += model.predict(X_test, batch_size=512, verbose=0).flatten()
        p /= 5.0

        # SAVE TEST PREDICTIONS
        sub.loc[start:end-1,'prediction'] = p  # .loc label slicing is end-inclusive
        start = end
        
        # CLEAN MEMORY
        del model, X_test, p
        gc.collect()
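The 5-fold ensembling in the loop above can be sketched in isolation. This is a hedged stand-in, not the notebook's actual model: `predict_with_fold` below is a hypothetical placeholder for reloading each fold's weights (`gru_fold_{1..5}.h5`) into one Keras model and calling `model.predict`:

```python
import numpy as np

def predict_with_fold(X, fold):
    # Stand-in for model.load_weights(...) followed by model.predict(...);
    # seeding by fold number just makes each "fold" give different outputs.
    rng = np.random.default_rng(fold)
    return rng.random(len(X))

X_test = np.zeros((4, 3))            # toy test batch of 4 rows

# Sum predictions over the five folds, then average
p = np.zeros(len(X_test))
for fold in range(1, 6):
    p += predict_with_fold(X_test, fold)
p /= 5.0

print(p.shape)                       # (4,)
```

Averaging the folds this way reuses a single model object and one running array `p`, which keeps GPU memory flat, the same reason the original loop reloads weights into one model instead of building five.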
