We present a framework for vision-based hand-movement prediction in a real-world human-robot collaboration scenario, aimed at guaranteeing safety. We first propose a perception submodule that takes only visual data as input and predicts the human collaborator's hand movement. We then develop an adaptive robot trajectory-planning submodule that incorporates the noisy movement-prediction signal into its optimization. To validate the proposed system, we first collect a new human manipulation dataset that supplements the previously available public dataset with motion-capture data serving as ground truth for hand location. We then integrate the algorithm with a six-degree-of-freedom robot manipulator that collaborates with human workers on a set of trained manipulation actions, and show that this robot system outperforms one without movement prediction in terms of collision avoidance. We verify the effectiveness of the proposed motion-prediction and trajectory-planning approaches in both simulated and physical experiments. To the best of the authors' knowledge, this is the first time a deep-model-based movement-prediction system has been deployed and proven effective for enhanced safety in a human-robot collaboration scenario.