Recently, Neural ODE (Ordinary Differential Equation) models, which compute their output by solving an ordinary differential equation, have been proposed. Because Neural ODE models use noticeably fewer parameters than traditional Deep Neural Networks (DNNs) and are more robust against gradient-based attacks, they are being adopted in many types of real-time applications. For real-time applications, response time (latency) is of paramount importance to the user experience. We observe that inference latency in Neural ODEs can be highly dynamic, and sometimes detrimental to the system, due to the adaptive nature of the ODE solvers. Consequently, understanding and evaluating the efficiency robustness of Neural ODE models is necessary, yet this problem has received little attention so far. Moreover, evaluating the efficiency robustness of any model depends on the relationship between input and latency, which has not yet been defined for Neural ODE models. In this work, we first formulate the relationship between an input and the dynamic latency of Neural ODE inference. Based on this formulation, we propose AntiNODE, which generates latency-surging adversarial inputs by increasing the computation performed inside Neural ODEs. We evaluate AntiNODE on two popular datasets and three ODE solvers, using both hardware-dependent and hardware-independent metrics. The results show that adversarial inputs generated by AntiNODE can decrease inference efficiency by up to 335%. Our evaluation also shows that the generated adversarial inputs transfer across multiple solvers and architectures, indicating the feasibility of black-box attacks.
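The key observation above, that an adaptive ODE solver's cost depends on the input, can be illustrated with a minimal, self-contained sketch. This is not the AntiNODE attack or the paper's models: the cubic-decay dynamics `f(t, y) = -y**3` stand in for a hypothetical learned vector field, and the simple embedded Euler/Heun step-size controller mimics the behavior of adaptive solvers such as RK45. The point is only that the number of function evaluations, a hardware-independent proxy for latency, varies sharply with the initial state.

```python
# Illustrative sketch (assumed dynamics, not the paper's solver or model):
# an adaptive step-size controller spends far more function evaluations
# on inputs that trigger fast transients.

def solve_adaptive(f, y0, t0, t1, tol=1e-4):
    """Integrate y' = f(t, y) with an embedded Euler/Heun pair,
    shrinking the step when the local error estimate exceeds tol.
    Returns the final state and the number of function evaluations."""
    t, y, h, nfev = t0, y0, (t1 - t0) / 10.0, 0
    while t < t1:
        h = min(h, t1 - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        nfev += 2
        y_heun = y + h * (k1 + k2) / 2.0   # second-order (Heun) update
        err = abs(h * (k2 - k1) / 2.0)     # |Heun - Euler| error estimate
        if err <= tol:                     # accept the step
            t, y = t + h, y_heun
        # standard controller: grow the step on easy regions, shrink on hard ones
        h *= min(2.0, max(0.1, 0.9 * (tol / max(err, 1e-16)) ** 0.5))
    return y, nfev

f = lambda t, y: -y ** 3                   # stand-in for a learned vector field

_, benign_nfev = solve_adaptive(f, 1.0, 0.0, 1.0)    # well-behaved input
_, costly_nfev = solve_adaptive(f, 100.0, 0.0, 1.0)  # input with a fast transient

print(benign_nfev, costly_nfev)
```

A latency attack in this setting amounts to searching for inputs like the second one: states whose trajectories force the error controller to take many tiny steps, inflating solver work (and thus response time) without changing the model.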