In order to assemble a part (e.g., of an engine), a human hand must obtain complete control of its motion through the application of forces and torques at multiple contact points. Today, synthesizing a good hand grasp of a part with Digital Human Modeling (DHM) tools is often time-consuming because these tools require detailed manual inputs from a user, such as placing a digital hand around a feasible grasp location and then closing the fingers around the part. In a previous paper, we presented two methods (Pointwise Shortest Distance and Environment Clearance) that color part surfaces according to environmental distance constraints so that a user, such as an assembly simulation expert, can easily identify feasible grasp locations. Thanks to the robustness of the implementation, even triangle meshes with common geometric flaws such as cracks and gaps can be handled. In this paper, we build on this feasibility analysis and present a user-guided grasp planning approach that significantly speeds up the grasp modeling process. First, the user selects a predefined grip type and sets an approach direction for the hand. To synthesize many grasps, we randomly sample the hand's rotation around the approach direction. Next, the hand is moved towards the part until the hand's Grasp Center Point (GCP) reaches the geometry of the part or a collision between the hand and the part is detected. If a collision is detected, we move the hand backwards until the hand and the part no longer collide. Finally, we close the hand's fingers around the part to synthesize a grasp. In this way, we can quickly synthesize a multitude of grasps and let the user choose among those with the best grasp qualities, where each grasp quality is computed from the corresponding 6D grasp wrench hull.
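The sampling loop described above can be sketched in simplified form. This is a minimal illustration, not the paper's implementation: the collision test, GCP check, and quality score below are hypothetical placeholders (the paper computes quality from the 6D grasp wrench hull, which is not reproduced here), and the hand's pose is reduced to a single coordinate along the approach direction.

```python
import math
import random

STEP = 0.01  # hand advance/retreat increment along the approach direction

def collides(hand_pos, part_radius=0.5):
    """Placeholder collision test: the hand collides once it enters a
    sphere of part_radius around the origin (the 'part')."""
    return abs(hand_pos) < part_radius

def gcp_reached(hand_pos, part_radius=0.5):
    """Placeholder test for the Grasp Center Point reaching the part."""
    return abs(hand_pos) <= part_radius

def grasp_quality(rotation):
    """Placeholder quality score; a real system would evaluate the
    6D grasp wrench hull of the closed-finger grasp."""
    return abs(math.cos(rotation))

def synthesize_grasps(n_samples, start_pos=2.0, seed=0):
    rng = random.Random(seed)
    grasps = []
    for _ in range(n_samples):
        # 1. Randomly sample the hand's rotation about the approach axis.
        rotation = rng.uniform(0.0, 2.0 * math.pi)
        # 2. Advance the hand until the GCP reaches the part
        #    or a collision is detected.
        pos = start_pos
        while not (gcp_reached(pos) or collides(pos)):
            pos -= STEP
        # 3. If colliding, retreat until the pose is collision-free.
        while collides(pos):
            pos += STEP
        # 4. Close the fingers (abstracted away here) and score the grasp.
        grasps.append((grasp_quality(rotation), rotation, pos))
    # Sort by quality so the user can choose among the best grasps.
    return sorted(grasps, reverse=True)

grasps = synthesize_grasps(5)
```

In a DHM tool, steps 2 and 3 would use the tool's mesh collision checker, and step 4 would run the finger-closing routine before the quality evaluation.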
We believe that this user-guided grasp planning approach can significantly improve the usability of DHM tools such as Intelligently Moving Manikins (IMMA).