test rnn

The models in test rnn are example test cases for Hugging Face Candle PR #2542.

The test models are based on PyTorch's nn.LSTM and nn.GRU reference implementations.

The test models are generated by the following scripts; a sketch for reloading and verifying a saved file follows the list.

  • lstm_test.pt: A simple LSTM model with 1 layer.

    import torch
    import torch.nn as nn
    
    rnn = nn.LSTM(10, 20, num_layers=1, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, (hn, cn) = rnn(input)
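    # with batch_first=True: output is (5, 3, 20); hn and cn are (1, 5, 20)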
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    state_dict['cn'] = cn
    torch.save(state_dict, "lstm_test.pt")
    
  • gru_test.pt: A simple GRU model with 1 layer.

    import torch
    import torch.nn as nn
    
    rnn = nn.GRU(10, 20, num_layers=1, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, hn = rnn(input)
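    # GRU has no cell state: output is (5, 3, 20); hn is (1, 5, 20)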
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    torch.save(state_dict, "gru_test.pt")
    
  • bi_lstm_test.pt: A bidirectional LSTM model with 1 layer.

    import torch
    import torch.nn as nn
    
    rnn = nn.LSTM(10, 20, num_layers=1, bidirectional=True, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, (hn, cn) = rnn(input)
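    # bidirectional: output is (5, 3, 40); hn and cn are (2, 5, 20)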
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    state_dict['cn'] = cn
    torch.save(state_dict, "bi_lstm_test.pt")
    
  • bi_gru_test.pt: A bidirectional GRU model with 1 layer.

    import torch
    import torch.nn as nn
    
    rnn = nn.GRU(10, 20, num_layers=1, bidirectional=True, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, hn = rnn(input)
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    torch.save(state_dict, "bi_gru_test.pt")
    
  • lstm_nlayer_test.pt: An LSTM model with 3 layers.

    import torch
    import torch.nn as nn
    
    rnn = nn.LSTM(10, 20, num_layers=3, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, (hn, cn) = rnn(input)
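    # 3 layers: output is (5, 3, 20); hn and cn are (3, 5, 20)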
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    state_dict['cn'] = cn
    torch.save(state_dict, "lstm_nlayer_test.pt")
    
  • bi_lstm_nlayer_test.pt: A bidirectional LSTM model with 3 layers.

    import torch
    import torch.nn as nn
    
    rnn = nn.LSTM(10, 20, num_layers=3, bidirectional=True, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, (hn, cn) = rnn(input)
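    # 3 bidirectional layers: output is (5, 3, 40); hn and cn are (6, 5, 20)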
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    state_dict['cn'] = cn
    torch.save(state_dict, "bi_lstm_nlayer_test.pt")
    
  • gru_nlayer_test.pt: A GRU model with 3 layers.

    import torch
    import torch.nn as nn
    
    rnn = nn.GRU(10, 20, num_layers=3, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, hn = rnn(input)
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    torch.save(state_dict, "gru_nlayer_test.pt")
    
  • bi_gru_nlayer_test.pt: A bidirectional GRU model with 3 layers.

    import torch
    import torch.nn as nn
    
    rnn = nn.GRU(10, 20, num_layers=3, bidirectional=True, batch_first=True)
    input = torch.randn(5, 3, 10)
    output, hn = rnn(input)
    
    state_dict = rnn.state_dict()
    state_dict['input'] = input
    state_dict['output'] = output.contiguous()
    state_dict['hn'] = hn
    torch.save(state_dict, "bi_gru_nlayer_test.pt")
    