I recently switched from Unity to Unreal after seeing the progress made on the user experience and the powerful tools available, which just don't compare with what Unity offers.

So I wanted to start using it for commercial work at my company (we do realtime architectural visualisations for companies or even cities), and we do most of our work on iOS and Windows 8 on kiosk-like computers with touch screens.

Here's the catch: we really need touch support in UE4, because without it a lot of functionality can't be implemented (pinch-to-zoom, two-finger drag, etc.) that our clients expect.

So here's the question: is it possible as of now to catch 2 or 3 touch inputs at the same time (like on iPads) on Windows 8 multitouch screens, or not? (If it is, I completely missed it…)

If not, do you have some ETA on this functionality, or do we have to code it ourselves one way or another?

Until such support is live we regretfully won't be able to work on apps for Windows with Unreal, which is a shame because everything else blew away our expectations!

Keep up the awesome work, you're Epic guys!

Actually, what I meant is that right now it doesn't seem possible to use the touch events and nodes in Blueprint to get touch input on the Windows platform (unless I'm doing something very wrong).

The finger at index 0 is recognized on a Windows touch screen only if the "Use Mouse for Touch" property is true in the project settings. If you set "Use Mouse for Touch" enabled, the finger at index 0 works when you touch the screen, because the engine thinks you just pressed a mouse button. So when you touch the screen with one finger in Tappy Chicken, or in my tests with "Use Mouse for Touch" disabled, no touch is returned: hence the incompatibility with multitouch (or any touch at all) that I was talking about.

What I actually tried is to get Tappy Chicken to "jump" not when you touch with finger index 0 (the first touch detected on screen), but when the finger index is 1 (so the input event fires when a second finger touches the screen while the first finger is still touching it).

The problem with this is that our company does a lot of work on kiosk-mode PCs for high-end architectural visualisation, and we need multitouch enabled for gestures like pinch-to-zoom, or for implementing a pan node by dragging two fingers across the screen. For this kind of implementation we would need hardware touch support on Windows (like on iOS platforms), so that all touches are recognized and kept distinct and we can capture information like the positions of the first and second fingers, their delta movement, and so on.

(For comparison, there is a Unity plugin that supports touch screen devices on Windows 7 and Windows 8. It uses a C++ DLL and a set of C wrappers that hook low-level touch events into your Unity application. This information is managed by an input wrapper class that allows programmers to change between mouse and multi-touch input seamlessly, in a similar way to the …)

Anyway, for now this is keeping us from using UE4 on our day-to-day projects, which is a shame because the whole team got really excited about how easy and powerful it has become. Still, if you could have a look at this issue we would be really grateful. Thanks for your hard work, and I will keep monitoring you guys!