-
I've converted the type of … Any clues or suggestions regarding this refactoring are appreciated.
Replies: 4 comments 3 replies
-
Can you point us to where exactly this change should happen?
-
@rjavadi what are you aiming to achieve with this? The surrogate module seems close to the Bayesian optimization, and hence it seems it should use tensors.
-
@Scienfitz In our meetings we discussed that the only surrogate model that requires using tensors is the Gaussian process, so I changed the base class to use ndarray instead of torch. I may be wrong :) It's also less work for me if we only need to load torch lazily here 😅
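A minimal sketch of the lazy-import idea mentioned above (the `lazy_import` helper name is hypothetical, not an actual API of the library):

```python
import importlib
from functools import lru_cache


@lru_cache(maxsize=None)
def lazy_import(name: str):
    """Import a module on first use; later calls return the cached module."""
    return importlib.import_module(name)


# Hypothetical use inside a surrogate's fit method, so that merely
# importing the surrogate module does not pull in torch:
# torch = lazy_import("torch")
```

This keeps importing the surrogate package cheap while still allowing torch-backed code paths when they are actually exercised.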
-
Closing as no longer relevant.
Hmm, I'm not sure that was correct. I think it's more like: anything that is used inside acquisition can keep using torch. While the surrogates are, strictly speaking, not derived from acquisition, they are only ever needed if acquisition is also used, so I guess it would be fine to keep them using torch (lazily).

Re your original question:
Did you try with reshape? I.e., an ndarray of shape `[2, 4, 3, 4]` would normally be flattened to `[96]`, but you could flatten everything except the last dimensions (according to what `end_dim` was chosen). For example, here `end_dim=-2` ignores the last dimension, resulting in shape `[24, 4]`. I would hope this creates the right order.

We need to keep in mind the performance…
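A small sketch of the reshape described above, using NumPy's default C (row-major) order, which matches what `torch.flatten(t, end_dim=-2)` produces:

```python
import numpy as np

# Example array with the shape from the comment above.
a = np.arange(2 * 4 * 3 * 4).reshape(2, 4, 3, 4)

# Flatten every dimension except the last one (the end_dim=-2 case):
flat = a.reshape(-1, a.shape[-1])
print(flat.shape)  # (24, 4)
```

Because both NumPy's `reshape` and torch's `flatten` traverse the array in row-major order, the element order is preserved.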