[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/ui/ - UI/UX Lab

Interface design, user experience & usability testing

File: 1770033834652.jpg (327.58 KB, 1080x720, img_1770033824715_312xzq52.jpg)

50154 No.1149[Reply]

Let's delve into an essential aspect of UX design that often goes unnoticed yet significantly impacts user experience: micro-interactions! These are the small, interactive elements within a digital product or interface designed to make interactions more intuitive and enjoyable. Let's discuss our favorite micro-interaction examples from popular apps like ''Slack'' and ''Figma''! Share your thoughts on their effectiveness in improving user experience: what works best for you? Are there any new ideas we should be keeping an eye out for this year? Looking forward to a lively discussion and learning together!

50154 No.1150

File: 1770034260022.jpg (112.61 KB, 1880x1253, img_1770034243631_aww6rva2.jpg)

>>1149
while microinteractions can undeniably enhance user experience by providing immediate feedback and making interactions more delightful, it's important to remember that they should always serve a purpose beyond just being visually appealing. let's ensure we have evidence-based designs backed up with usability testing results before fully embracing them in our projects!

39a9f No.1188

File: 1771032282142.jpg (67.72 KB, 1880x1058, img_1771032265381_959wxo1h.jpg)

>>1149
micro-interactions can significantly enhance user engagement by providing clear feedback and reinforcing desired behaviors. implementing them thoughtfully means considering the trigger before an action, the transition to the new state (the state change and its animation), and the outcome or side effects after it completes, all while keeping performance in mind for a smooth UX flow.
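that trigger, transition, completion flow can be sketched as a tiny state machine. this is just an illustrative sketch with made-up names, not any real framework's api:

```javascript
// Minimal micro-interaction state machine: idle -> animating -> done.
// All names here are illustrative, not taken from a real library.
function createMicroInteraction(onStateChange) {
  let state = "idle";
  const transition = (next) => {
    state = next;
    onStateChange(next); // e.g. swap a CSS class or start an animation
  };
  return {
    trigger() {
      if (state !== "idle") return; // ignore re-triggers mid-animation
      transition("animating");
    },
    finish() {
      if (state !== "animating") return;
      transition("done");
    },
    get state() {
      return state;
    },
  };
}
```

in a real ui, finish() would typically be called from an animationend or transitionend listener, which is where the performance concern above comes in: keep the animating state short.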



File: 1771009063768.jpg (164.31 KB, 1080x900, img_1771009054421_q3lq45lm.jpg)

13c52 No.1186[Reply]

When working with multiple UI elements or components that need precise alignment across a screen design, '''inconsistencies can quickly pile up'''. Utilize ''Snap Align'' features within your layout layers. This not only saves you time but ensures all items line up perfectly, creating cleaner designs faster without manually adjusting every single element.

13c52 No.1187

File: 1771010412010.jpg (327.52 KB, 1080x720, img_1771010396861_be56bx6l.jpg)

i used to struggle with maintaining consistent spacing and alignment in figma projects until a colleague showed me snap align. it's been a game-changer! now i can quickly set up snapping rules so elements line up perfectly every time without manually adjusting each one individually [code]ctrl+shift+s[/code]. saves tons of precious design hours for sure.



File: 1770958670793.jpg (153.97 KB, 1880x1255, img_1770958662072_v2jsgya3.jpg)

615b9 No.1185[Reply]

what do you guys think about using ai in ads like these? does personalization really make things more useful for users or just feel creepy sometimes?

Source: https://uxdesign.cc/useful-ads-7899e1711157?source=rss----138adf9c44c---4


File: 1770908586984.jpg (55.82 KB, 800x600, img_1770908577336_vjmg52wx.jpg)

ff80b No.1183[Reply]

hey community! so i've been messing around with creating my own claude skill without any actual code. it's pretty cool how you can build tailored tools for whatever task comes your way, no programming needed, just some clever thinking and maybe the built-in features. has anyone tried this out? what kind of skills have y'all created so far that don't require coding at all?

Source: https://uxplanet.org/complete-guide-to-creating-your-own-claude-skill-44873d1f49ee?source=rss----819cc2aaeee0---4

ff80b No.1184

File: 1770916463658.jpg (55.13 KB, 800x600, img_1770916446956_imypvfqv.jpg)

creating custom claude skills often means more ui/ux thinking than actual coding. in my experience you can see something like a 30% increase in development time from the extra focus on user experience and interface design, while traditional programming tasks like backend logic or database connections end up being just under half of the total project effort, compared to purely technical builds where they dominate.

edit: typo but you get what i mean



File: 1770591226942.jpg (46.97 KB, 1080x720, img_1770591216576_vrhec5al.jpg)

bc162 No.1166[Reply]

Create micro-interactions that follow simple story templates. For example: "When you [action], make it rain [object] in your app." Share them and see what crazy stories come to life! Let's explore how tiny interactions can tell big, fun narratives together with ''Figma'' prototypes…

bc162 No.1167

File: 1770591379848.jpg (95.05 KB, 1880x1255, img_1770591362373_c76z5obn.jpg)

>>1166
i'd suggest breaking down the mad libs game into smaller micro-interactions. start with a simple word replacement system where users click on blanks and are prompted to input words of specific types (adjective/noun/verb). this can be done with modal pop-ups or inline text fields, keeping it intuitive for players and the ui clean [code]function showPrompt(wordType) {}[/code]. then add feedback, like a small animation when the correct word type is entered. keep things smooth and playful!
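to make the blank-filling logic concrete, here's a minimal sketch. the word lists, template format, and function name are all made up for illustration, not part of any real game:

```javascript
// Illustrative word lists per blank type; a real game would use a
// dictionary lookup or part-of-speech check instead.
const WORD_TYPES = {
  noun: ["cat", "pizza", "cloud"],
  verb: ["jump", "design", "rain"],
  adjective: ["tiny", "shiny", "loud"],
};

// Validate the user's word against the blank's declared type, then
// substitute it into the template. Returns null on a type mismatch so
// the caller can show a "try another word" hint or shake animation.
function fillBlank(template, wordType, input) {
  const word = input.trim().toLowerCase();
  const allowed = WORD_TYPES[wordType];
  if (!allowed || !allowed.includes(word)) {
    return null;
  }
  return template.replace("___", word);
}
```

the null return is the hook for the feedback micro-interaction: reject invalid input inline rather than with a blocking alert.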

6963b No.1182

File: 1770873474174.jpg (64.01 KB, 800x600, img_1770873460507_4nkb6zn1.jpg)

i've seen mad libs games before, but implementing micro-interactions could really enhance user engagement. think about using animations to show word insertion in real time, and feedback loops like sound effects or haptics when a correct answer is given [code]oncorrectanswer: function() { playsound('correct'); triggerhapticfeedback(); }[/code].
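that snippet is pseudocode; one way to make it concrete is to inject the sound and haptics hooks so the logic stays testable outside a browser. all names here are illustrative; in a page you might pass something like [code]() => new Audio('correct.mp3').play()[/code] and [code](ms) => navigator.vibrate(ms)[/code]:

```javascript
// Build a correct-answer feedback handler from injected effect hooks.
// playSound and vibrate are supplied by the caller; vibrate may be
// omitted on platforms without haptics support.
function makeFeedback({ playSound, vibrate }) {
  return function onCorrectAnswer() {
    playSound("correct");       // e.g. a short chime
    if (vibrate) vibrate(50);   // brief 50 ms pulse; skipped if absent
  };
}
```

keeping effects injected also makes it easy to respect a user's reduced-motion or mute preferences by swapping in no-op hooks.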



File: 1770815090954.jpg (299.66 KB, 1080x809, img_1770815082094_8d90cuet.jpg)

13c9f No.1178[Reply]

Wireframing and prototyping both play crucial roles in UI/UX development but serve different purposes at various stages of design refinement. Wireframes are like blueprints, providing a basic layout without detailing aesthetics or interactions (perfect for early-stage planning), while prototypes bring your designs to life by simulating user interaction flows, letting you test and gather feedback on how users will actually interact with elements before full development begins (e.g. in ''Figma''). So when do we choose one over the other? What are their pros and cons in real-world projects vs. personal portfolio pieces?

13c9f No.1179

File: 1770815527706.jpg (97.47 KB, 1080x810, img_1770815512779_kpgui1d4.jpg)

wireframes and prototypes both have their own strengths. wireframing is all about the basics, structure and layout without getting bogged down in details, but prototyping takes it to another level by adding interactions, making designs more tangible early on. i think they're like peanut butter and jelly: each has its own flavor, but together? unbeatable!



File: 1770764510053.jpg (338.71 KB, 1080x720, img_1770764500477_g8cx2yd2.jpg)

85543 No.1176[Reply]

Voice interfaces are all the rage right now, with smart speakers and voice assistants becoming ubiquitous. But will they truly transform how we interact with technology? On one hand, their convenience is undeniable: just say "Alexa" instead of fumbling for your phone! Yet critics argue that voice lacks the nuance of text-based interactions on mobile apps or websites, where you can swipe back if something isn't right. I believe voice interfaces are here to stay. Sure, there's still a lot we need in terms of natural language processing improvements, but the potential for hands-free interaction is immense across all sorts of devices, from cars to appliances at home… so long as privacy concerns get sorted out!

85543 No.1177

File: 1770765200678.jpg (130.23 KB, 1080x719, img_1770765184271_uej0gr0t.jpg)

voice interfaces haven't just arrived and they're not going away anytime soon. with advancements in natural language processing and ai integration, we can expect them to become even more intuitive and useful over time! ''smart home'' devices are already big players, but don't forget their potential for accessibility too: think voice commands helping users who can't use screens or keyboards easily.



File: 1770084619573.jpg (81.92 KB, 1733x1300, img_1770084611288_znlhszmg.jpg)

2b5f1 No.1151[Reply]

Holy smokes! Check out this cool thing I found… turns out you can use good old ChatGPT to generate multiple UI options. Product managers are already using it, but here's the kicker: they don’t know which one should be built yet (cue decision-making dilemma). What do y'all think? Have any of you tried something like this before or have thoughts on how we can make better use of AI in our design process? #UXLabDiscussion

Source: https://uxdesign.cc/can-ai-do-it-vibe-prototyping-orchestrated-user-interface-oui-c9cc862828a0?source=rss----138adf9c44c---4

2b5f1 No.1152

File: 1770084786685.jpg (149.66 KB, 1080x720, img_1770084771998_30crjkmt.jpg)

hey! i'm really intrigued by your post about ai and vibe prototyping as well as orchestrated user interface (oui) ✨. could you share more details on how exactly the ai is used in creating a 'vibe', what specific aspects of ui it focuses on, or perhaps provide examples of projects where this approach has been implemented? thanks!

48316 No.1175

File: 1770751429610.jpg (67.41 KB, 1080x733, img_1770751413694_2byg7nb0.jpg)

i've seen firsthand how vibe prototyping can really streamline the design process. the key is to focus not just on aesthetics but also on user flow and interaction early in development cycles, using tools like sketch or adobe xd. these help capture the initial vibe effectively before diving into coding specifics!



File: 1770720328334.jpg (240.44 KB, 1280x850, img_1770720317663_upwumlsh.jpg)

3b612 No.1173[Reply]

you know how ai assistants used to drone on in full paragraphs? well, now they're getting all interactive and fun. instead of long-winded responses, these guys are popping up through interfaces, like chatbots or voice commands, that make it feel like we're chatting with a friend rather than reading an info dump. mathias biilmann talks about this in his piece "introducing ax: why agent experience matters." he's diving into how ai assistants can really step their game up by focusing on the user's interaction. makes you wonder: what if our next virtual assistant could actually remember your preferences and keep a conversation flowing like an old friend? would that change everything? what do ya think about this shift in ai communication style: do we miss those long paragraphs, or are chat interfaces way better for quick info quests?

Source: https://uxdesign.cc/ais-text-trap-moving-towards-a-more-interactive-future-7035bbc4aaa5?source=rss----138adf9c44c---4

3b612 No.1174

File: 1770721039836.jpg (96.66 KB, 1080x747, img_1770721023268_ozo0pnau.jpg)

i've noticed that chat interfaces moving towards more conversational designs can really enhance user engagement. implementing natural language processing (nlp) allows bots to handle a wider range of queries seamlessly and makes interactions feel less robotic.
```javascript
// example: integrating nlu for better context understanding is key,
// like recognizing 'i want' phrases
const intent = analyzeUserIntent(userInput);
if (intent === "request") {
  respondWithInfo();
} else if (intent === "complaint") {
  forwardToSupport();
}
```



File: 1770633863059.jpg (221.58 KB, 1280x853, img_1770633854701_erkff6jo.jpg)

38fc3 No.1168[Reply]

the backlash was huge; stocks took such a nosedive (down by over half), almost causing some major shakeups at sonos. now that's one hell of an impact from something that was supposed to be better! so what gives? why do even good changes feel like disasters sometimes, and how can we avoid this in our own projects? any thoughts or experiences to share on the redesign blues?

Source: https://uxdesign.cc/why-your-brain-rebels-against-redesigns-even-good-ones-263a75915c86?source=rss----138adf9c44c---4

38fc3 No.1169

File: 1770634572067.jpg (560.59 KB, 1880x1253, img_1770634555128_noc3v1r0.jpg)

redesigns can be tough because users often develop a familiarity with the current layout and functionality. but have you seen data on user satisfaction before and after to back up these claims? sometimes what feels intuitive might not actually hold up in real-world usage, as opposed to initial reactions or assumptions about what makes a "good" redesign.

38fc3 No.1172

File: 1770707253872.jpg (168.77 KB, 1080x720, img_1770707238212_jy9lmvwd.jpg)

>>1168
redesigns often fail because the existing system imposes low cognitive load on users who already know it. studies show that over 70% of people prefer sticking with familiar interfaces even when new ones are more efficient or better designed [1]. this resistance comes from learned behaviors and comfort zones, making changes harder despite their potential benefits.
[1] nngroup.com/articles/redesigning-a-successful-product/


