In our daily lives, we interact with fluids by touching them directly with our hands. A fluid exerts a pressure field on the surface of the hand, and we perceive fluid motion through pressure distributions that vary over the skin in both space and time, depending on the fluid's properties as well as on the pose and motion of the interacting hand. To improve the realism of interactive fluid simulation, we propose a real-time fluid tactile rendering technique that computes the pressure field on a virtual hand surface and delivers it to the user's actual hand via an ultrasound-based mid-air haptic display. Our haptic rendering algorithm computes the feedback force in two stages: first, the pressure distribution produced by the rigid-fluid interaction is computed by a real-time Lagrangian fluid simulation; then, a set of focal points reflecting this pressure field is extracted with a hill-climbing method that locates the local maxima of the simulated pressure field. We implement a real-time smoothed-particle hydrodynamics (SPH) fluid simulator together with the proposed haptic rendering algorithm, using an adaptive amplitude-modulation approach, and demonstrate the effectiveness of our method in a variety of fluid tactile rendering scenarios.
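To make the two-stage pipeline concrete, the following is a minimal Python sketch, not the paper's implementation: it assumes a numpy-only setup, a poly6 smoothing kernel, brute-force neighbor search over hand-surface samples, and a cap of four simultaneous focal points; function names such as `sample_pressure` and `extract_focal_points`, the kernel choice, and the pressure-to-amplitude mapping are all illustrative assumptions.

```python
import numpy as np

def poly6_kernel(r, h):
    """Poly6 SPH smoothing kernel; zero beyond the support radius h."""
    w = np.zeros_like(r)
    inside = r < h
    w[inside] = (315.0 / (64.0 * np.pi * h**9)) * (h**2 - r[inside]**2) ** 3
    return w

def sample_pressure(surface_pts, fluid_pos, fluid_p, mass, density, h):
    """Stage 1: SPH interpolation of fluid pressure at hand-surface samples:
    p(x) = sum_j (m_j / rho_j) * p_j * W(|x - x_j|, h)."""
    out = np.zeros(len(surface_pts))
    for i, x in enumerate(surface_pts):
        r = np.linalg.norm(fluid_pos - x, axis=1)
        out[i] = np.sum((mass / density) * fluid_p * poly6_kernel(r, h))
    return out

def knn_graph(pts, k=6):
    """Brute-force k-nearest-neighbor adjacency over the surface samples."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    return np.argsort(d, axis=1)[:, 1:k + 1]  # column 0 is the point itself

def extract_focal_points(surface_pts, pressure, neighbors, max_focal=4):
    """Stage 2: hill-climb from every sample to a local pressure maximum,
    then keep the strongest distinct maxima as focal points."""
    maxima = set()
    for start in range(len(surface_pts)):
        cur = start
        while True:
            nbrs = neighbors[cur]
            best = nbrs[np.argmax(pressure[nbrs])]
            if pressure[best] <= pressure[cur]:
                break  # no higher neighbor: cur is a local maximum
            cur = best
        maxima.add(cur)
    ranked = sorted(maxima, key=lambda j: -pressure[j])[:max_focal]
    return surface_pts[ranked], pressure[ranked]

# Toy usage: random fluid particles around a flat 12x12 "palm" patch.
rng = np.random.default_rng(0)
fluid_pos = rng.uniform(-0.05, 0.05, size=(500, 3))   # meters
fluid_p = rng.uniform(0.0, 200.0, size=500)           # Pa, placeholder values
gx, gy = np.meshgrid(np.linspace(-0.04, 0.04, 12),
                     np.linspace(-0.04, 0.04, 12))
surface_pts = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)])

p_surf = sample_pressure(surface_pts, fluid_pos, fluid_p,
                         mass=1e-3, density=1000.0, h=0.02)
focal_pts, focal_p = extract_focal_points(surface_pts, p_surf,
                                          knn_graph(surface_pts))
# Adaptive amplitude modulation (assumed mapping): drive each focal point
# with an AM amplitude proportional to its normalized local pressure.
amps = focal_p / focal_p.max() if focal_p.max() > 0 else focal_p
print(focal_pts, amps)
```

Because the climb runs from every surface sample, distinct maxima deduplicate naturally, and ranking them by pressure keeps the focal-point count within the display's limit; a real implementation would replace the brute-force neighbor search with a spatial hash or mesh adjacency for per-frame performance.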