
March 5, 2026

Adding Web Haptics to my Portfolio

8 min read


Modern interfaces are visual, but the best mobile experiences also feel physical. In this post I explain how I added subtle haptic feedback to my portfolio using the web-haptics library and how it improves interaction feedback.


Why I Added Haptics to My Portfolio

Most websites rely entirely on visual feedback — color changes, animations, or transitions. But modern mobile devices allow us to add tactile feedback using vibration.

When you press a button in a good mobile app, you often feel a small vibration confirming the action. That tiny detail makes the interface feel responsive and physical.

While working on my portfolio, I wanted interactions like:

  • button clicks
  • toggles
  • form submissions
  • picker selections

to feel more alive on mobile devices.

So I experimented with the web-haptics library.


What is Web Haptics?

web-haptics is a lightweight library that adds haptic feedback to web apps using the Web Vibration API.

Key things I liked about it:

  • Zero dependencies
  • Very small
  • Works across frameworks
  • Gracefully does nothing on unsupported devices

If a device does not support vibration (like most desktops), the library simply no-ops silently. That means no special feature detection is needed.
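That graceful fallback is easy to replicate against the raw Vibration API. A minimal sketch of the idea (this is not the library's source, just the underlying technique):

```typescript
// Vibrate where the Vibration API exists; silently no-op elsewhere.
// navigator.vibrate takes a duration in ms or a vibrate/pause pattern
// and returns false if the pattern was rejected.
function safeVibrate(pattern: number | number[]): boolean {
  if (typeof navigator === "undefined" || !("vibrate" in navigator)) {
    return false; // unsupported environments (e.g. Safari, Node) land here
  }
  return navigator.vibrate(pattern);
}
```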

Installation is simple:
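Presumably via your package manager of choice (I'm assuming the npm package name matches the repo name):

```shell
npm install web-haptics
```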


Additionally, for AI agents, there is a skills file available at https://github.com/lochie/web-haptics/blob/main/SKILL.md


Setting Up Web Haptics in a Next.js / React Project

Since my portfolio is built with Next.js and React, I used the React integration provided by the library.

The library provides a hook called useWebHaptics, which makes it easy to trigger vibrations from React components.

However, I wanted a small abstraction layer so I could attach additional UI feedback whenever haptics are triggered.

For example:

  • shake the favicon
  • enable debug vibration on desktop
  • keep all haptic logic in one place

So I created a custom hook called useHaptics.


Creating a Custom Haptics Hook

This hook wraps the library and adds a little extra behavior.
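A sketch of what that hook might look like. The return shape of `useWebHaptics` and the `shakeFavicon` helper are assumptions here, not the library's documented API:

```tsx
// hooks/useHaptics.ts — a sketch, not the library's documented API.
import { useCallback } from "react";
import { useWebHaptics } from "web-haptics";

// Hypothetical helper: briefly swaps the <link rel="icon"> href
// to make the favicon "shake". Implementation omitted.
function shakeFavicon(): void {}

export function useHaptics() {
  const { vibrate } = useWebHaptics(); // assumed return shape

  return useCallback(
    (pattern: number | number[] = 10) => {
      vibrate(pattern); // no-ops on unsupported devices
      shakeFavicon();   // extra visual feedback in the browser tab
      if (process.env.NODE_ENV === "development") {
        console.debug("[haptics]", pattern); // desktop debug stand-in
      }
    },
    [vibrate]
  );
}
```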


This gives me a single unified way to trigger haptics anywhere in the app.

Now any component can simply call:
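Something like the following, from inside any component (the single-function return shape is my assumption):

```tsx
const haptic = useHaptics(); // inside a React component body

haptic();             // default: one short tap
haptic([40, 20, 40]); // or any custom vibrate/pause pattern (ms)
```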



Using Haptics in a Component

Here is a simple example of triggering haptic feedback when a button is clicked.
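A sketch (the component name and import path are hypothetical; the haptics flow through the custom `useHaptics` hook from earlier):

```tsx
import { useHaptics } from "@/hooks/useHaptics"; // hypothetical path

export function LikeButton() {
  const haptic = useHaptics();

  return (
    <button
      onClick={() => {
        haptic(); // short tap on supported phones, no-op on desktop
        // ...rest of the click handler
      }}
    >
      Like
    </button>
  );
}
```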


When the button is pressed on a supported device, the phone produces a short vibration.

This tiny feedback makes interactions feel much more responsive.


Creating Custom Haptic Patterns

One of the most interesting features of the library is that you can define custom vibration patterns.

You can experiment with patterns using the online playground:

https://haptics.lochie.me/

For example, I created a small error vibration pattern like this:
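A pattern along these lines (the exact durations are illustrative; in the Vibration API, array entries alternate vibrate and pause durations in milliseconds, starting with a vibration):

```typescript
// Three quick pulses: vibrate 60ms, pause 40ms, repeated.
const errorPattern = [60, 40, 60, 40, 60];

// Fire it where the Vibration API exists; silently skip elsewhere.
function vibrateError(): boolean {
  if (typeof navigator === "undefined" || !("vibrate" in navigator)) {
    return false;
  }
  return navigator.vibrate(errorPattern);
}
```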


This produces a quick triple vibration, which feels similar to an error or failed action in many mobile apps.

Small details like this help communicate state and feedback to the user without relying purely on visuals.


Conclusion

Adding haptics to my portfolio was a small experiment, but it made the interface feel much more tactile and responsive on mobile devices.

Instead of relying only on visual animations, the interface now physically responds to user actions.

The web-haptics library made this extremely easy to implement, and with a small custom hook I was able to integrate it cleanly into my React components.

If you are building mobile-first web experiences, subtle haptic feedback can significantly improve how interactions feel.

Sometimes the best UI improvements are the ones users can feel, not just see.
