Can we use from_torch for a sub var?

I have a var `a` with shape `(time, grid_size, grid_size)` and a torch tensor `b` with shape `(grid_size, grid_size)`. Can I call `a[0].from_torch(b)`?

Can I do the same with `to_torch`, i.e. `b = a[0].to_torch()`?

And with `fill`, i.e. `a[0].fill(0)`?
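To pin down what each of the three calls should mean, here is a sketch of the intended slice-level semantics using NumPy arrays as stand-ins for the var and the torch tensor (the `from_torch` / `to_torch` / `fill` names come from the question; this block only illustrates the desired behavior, not the framework's implementation):

```python
import numpy as np

time, grid_size = 4, 3
a = np.zeros((time, grid_size, grid_size))   # stand-in for the var `a`
b = np.arange(grid_size * grid_size, dtype=float).reshape(grid_size, grid_size)

# a[0].from_torch(b): copy b into the first time slice only
a[0][...] = b

# b2 = a[0].to_torch(): export the slice as a standalone tensor
b2 = a[0].copy()

# a[0].fill(0): zero only the first slice, leaving a[1:] untouched
a[0][...] = 0.0

assert np.array_equal(b2, b)
assert a[0].sum() == 0.0
```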

The answer to all three is currently no. Please implement kernels for these operations. If that leads to duplicated code, consider using kernel templates.
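Since the three slice kernels would be near-identical copy loops, one way to share the body is a small kernel factory. This is only a hedged sketch of the template idea in plain Python with NumPy, not the framework's actual template mechanism; `make_slice_kernel` and the mode names are hypothetical:

```python
import numpy as np

def make_slice_kernel(mode):
    """Template: generate one of the three slice kernels from a shared body.

    mode: "from" (copy a tensor into the slice), "to" (copy the slice out),
    or "fill" (broadcast a scalar into the slice).
    """
    def kernel(var, index, arg=None):
        view = var[index]
        if mode == "from":
            view[...] = arg          # a[0].from_torch(b)
        elif mode == "to":
            return view.copy()       # b = a[0].to_torch()
        elif mode == "fill":
            view[...] = arg          # a[0].fill(0)
    return kernel

from_slice = make_slice_kernel("from")
to_slice = make_slice_kernel("to")
fill_slice = make_slice_kernel("fill")

a = np.zeros((2, 3, 3))
b = np.ones((3, 3))
from_slice(a, 0, b)
out = to_slice(a, 0)
fill_slice(a, 0, 0.0)
assert np.array_equal(out, b) and a[0].sum() == 0.0
```

The same pattern carries over to real kernel templates: instantiate one parameterized body three times instead of writing three hand-copied loops.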