TexSenseGAN: A User-Guided System for Optimizing Texture-Related Vibrotactile Feedback Using Generative Adversarial Network
This repository contains the code for the paper: TexSenseGAN: A User-Guided System for Optimizing Texture-Related Vibrotactile Feedback Using Generative Adversarial Network
The open dataset used in this paper: LMT Haptic Texture Database (108 surface materials, SoundScans, Movement)
To obtain the preprocessed dataset, run the notebook preprocess.ipynb. In this study, we selected 14 classes to build the training dataset.
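The notebook's exact steps depend on how the downloaded database is laid out. As a rough, purely illustrative sketch (the file format, folder layout, class names, and spectrogram parameters below are assumptions, not the notebook's actual code), building a spectrogram dataset from a 14-class subset could look like this:

```python
# Hypothetical sketch only: assemble log-spectrograms for a 14-class subset of
# the LMT Haptic Texture Database. Paths, class names, file format, and
# spectrogram parameters are assumptions; see preprocess.ipynb for the real steps.
from pathlib import Path
import numpy as np
from scipy import signal
from scipy.io import wavfile

DATA_ROOT = Path("LMT_TextureDB")                         # assumed location of the raw recordings
SELECTED_CLASSES = [f"class_{i:02d}" for i in range(14)]  # placeholders for the 14 chosen materials

samples, labels = [], []
for label, cls in enumerate(SELECTED_CLASSES):
    for wav_path in sorted((DATA_ROOT / cls).glob("*.wav")):
        sr, x = wavfile.read(wav_path)                    # one texture-scan recording
        x = x.astype(np.float32)
        x /= np.abs(x).max() + 1e-12                      # normalize amplitude
        _, _, spec = signal.spectrogram(x, fs=sr, nperseg=256, noverlap=128)
        samples.append(np.log1p(spec))                    # log-scaled magnitude spectrogram
        labels.append(label)

np.savez("preprocessed_dataset.npz",
         x=np.array(samples, dtype=object), y=np.array(labels))
```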
Run TactileCAAE/train.py to train the model. The dictionary of trained model parameters (the state dict) is saved in TactileCAAE. After loading the trained parameters, the model can be used directly for the user-guided optimization.
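As a minimal sketch of the loading step (assuming the model is implemented in PyTorch; the module path, class name, and checkpoint filename below are guesses, so check the files in TactileCAAE for the actual names):

```python
# Minimal sketch: restore trained TactileCAAE parameters before running the
# user-guided optimization. Module, class, and checkpoint names are assumptions.
import torch
from TactileCAAE.model import Decoder                   # assumed module and class name

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
decoder = Decoder().to(device)
state_dict = torch.load("TactileCAAE/decoder.pth", map_location=device)
decoder.load_state_dict(state_dict)
decoder.eval()                                          # inference only during optimization
```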
Run DSS_Experiment_UserInitialization.py to start the optimization with user initialization. Run DSS_Experiment.py to start the optimization directly.
If you find this repo helpful, please cite:
@ARTICLE{10891204,
author={Zhang, Mingxin and Terui, Shun and Makino, Yasutoshi and Shinoda, Hiroyuki},
journal={IEEE Transactions on Haptics},
title={TexSenseGAN: A User-Guided System for Optimizing Texture-Related Vibrotactile Feedback Using Generative Adversarial Network},
year={2025},
volume={},
number={},
pages={1-15},
keywords={Vibrations;Optimization;Generative adversarial networks;Vectors;Generators;Deep learning;Training;Human in the loop;Haptic interfaces;Aerospace electronics;Haptic display;Human-computer interaction;Optimization;Deep learning;Autoencoder;Generative adversarial networks},
doi={10.1109/TOH.2025.3542424}}
This code is based on the implementation of Difference-Subspace-Search.