Printing feature maps with PyTorch hooks

A hook is a function that lets you attach custom logic to a PyTorch module, running during the forward pass, the backward pass, or just before them. In this post we use forward hooks on torchvision's vgg19 model to print the maximum and minimum of each feature map and save the maps to disk.

First, import the libraries and create an input tensor.

import torch
import torchvision
import numpy as np

model = torchvision.models.vgg19(pretrained=True)
input = torch.rand(1, 3, 224, 224)  # a random batch of one 3x224x224 image

Define the hook function.

def get_features_hook(self, input, output):
    # move the feature map to the CPU as a NumPy array
    fmap = output.data.cpu().numpy()
    _max = fmap.max()
    _min = fmap.min()
    # np.save appends ".npy"; note that modules with identical reprs
    # (e.g. the ReLU layers) overwrite each other's files
    np.save(str(self), fmap)
    print("self: ", str(self), 'max: ', _max, '\nmin: ', _min)

Here self is the module the hook is registered on, and input and output are that module's input (a tuple of tensors) and output tensor.
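To see these three arguments in isolation, here is a minimal self-contained sketch (not from the post) that registers a forward hook on a single Conv2d and records what the hook receives:

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)

seen = {}

def probe(module, inputs, output):
    # `module` is the layer itself, `inputs` is a tuple of input tensors,
    # and `output` is the layer's output tensor
    seen['module'] = module
    seen['in_shape'] = inputs[0].shape
    seen['out_shape'] = output.shape

handle = conv.register_forward_hook(probe)
conv(torch.rand(1, 3, 16, 16))
handle.remove()  # the returned handle detaches the hook again
print(seen['in_shape'], seen['out_shape'])
```

Note that register_forward_hook returns a handle, so a hook used only for inspection can be removed once you are done.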

Now register the hook on every layer of the model.

for name, block in model.named_children():
    # skip non-Sequential children (newer torchvision versions add an
    # avgpool child here that cannot be indexed)
    if not isinstance(block, torch.nn.Sequential):
        continue
    for k in range(len(block)):
        block[k].register_forward_hook(get_features_hook)
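When hooks are registered in a loop like this, it can be useful to collect the returned handles so all hooks can be detached later. A self-contained sketch on a small toy model (hypothetical, not part of the post):

```python
import torch
import torch.nn as nn

# a toy Sequential stand-in for one of vgg19's blocks
toy = nn.Sequential(
    nn.Conv2d(3, 4, 3),            # 8x8 input -> 4x6x6 output
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(4 * 6 * 6, 2),
)

stats = []

def stats_hook(module, inputs, output):
    # record each layer's output range, as in the post
    stats.append((str(module), output.min().item(), output.max().item()))

handles = [m.register_forward_hook(stats_hook) for m in toy]
toy(torch.rand(1, 3, 8, 8))
for h in handles:
    h.remove()  # detach all hooks when done
print(len(stats))  # one entry per layer
```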

Run the model.

model(input)

Example output (the values will differ for a different random input):

self:  Conv2d(3, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  3.818097 
min:  -2.712644
self:  ReLU(inplace) max:  3.818097 
min:  0.0
self:  Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  10.870506 
min:  -11.56201
self:  ReLU(inplace) max:  10.870506 
min:  0.0
self:  MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False) max:  10.870506 
min:  0.0
self:  Conv2d(64, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  15.277631 
min:  -21.049593
self:  ReLU(inplace) max:  15.277631 
min:  0.0
self:  Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  28.45929 
min:  -28.66295
self:  ReLU(inplace) max:  28.45929 
min:  0.0
self:  MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False) max:  28.45929 
min:  0.0
self:  Conv2d(128, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  34.704872 
min:  -63.00056
self:  ReLU(inplace) max:  34.704872 
min:  0.0
self:  Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  31.269756 
min:  -31.316256
self:  ReLU(inplace) max:  31.269756 
min:  0.0
self:  Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  47.044506 
min:  -34.318226
self:  ReLU(inplace) max:  47.044506 
min:  0.0
self:  Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  63.32787 
min:  -46.09143
self:  ReLU(inplace) max:  63.32787 
min:  0.0
self:  MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False) max:  63.32787 
min:  0.0
self:  Conv2d(256, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  38.09463 
min:  -21.3571
self:  ReLU(inplace) max:  38.09463 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  32.68479 
min:  -28.718779
self:  ReLU(inplace) max:  32.68479 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  28.321882 
min:  -22.381243
self:  ReLU(inplace) max:  28.321882 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  15.468933 
min:  -24.071054
self:  ReLU(inplace) max:  15.468933 
min:  0.0
self:  MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False) max:  15.468933 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  11.099591 
min:  -8.545683
self:  ReLU(inplace) max:  11.099591 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  10.742858 
min:  -9.86191
self:  ReLU(inplace) max:  10.742858 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  6.1253543 
min:  -8.27225
self:  ReLU(inplace) max:  6.1253543 
min:  0.0
self:  Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1)) max:  5.9240026 
min:  -8.518373
self:  ReLU(inplace) max:  5.9240026 
min:  0.0
self:  MaxPool2d(kernel_size=2, stride=2, padding=0, dilation=1, ceil_mode=False) max:  5.9240026 
min:  0.0
self:  Linear(in_features=25088, out_features=4096, bias=True) max:  2.463348 
min:  -3.7846608
self:  ReLU(inplace) max:  2.463348 
min:  0.0
self:  Dropout(p=0.5) max:  3.7882001 
min:  0.0
self:  Linear(in_features=4096, out_features=4096, bias=True) max:  2.5730212 
min:  -4.016001
self:  ReLU(inplace) max:  2.5730212 
min:  0.0
self:  Dropout(p=0.5) max:  5.1460423 
min:  0.0
self:  Linear(in_features=4096, out_features=1000, bias=True) max:  3.6514015 
min:  -3.4203067

Below is the full code.
