Purpose: Invalidity Analysis


Patent: US9645663B2
Filed: 2013-03-24
Issued: 2017-05-09
Patent Holder: Belisso LLC (Original Assignee); Onscreen Dynamics LLC (Current Assignee)
Inventor(s): Sergey Mavrody

Title: Electronic display with a virtual bezel

Abstract: An electronic device with a touchscreen display is provided comprising an active touchscreen region and a virtual bezel area, the active touchscreen region functioning to process a first set of touch-based inputs from a user of the electronic device according to a first mode of operation, and the virtual bezel area functioning to process a second set of touch-based inputs from a user of the electronic device according to a second mode of operation.
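To make the claimed split concrete, the sketch below models a touch surface whose border strip is treated as a software-defined (virtual) bezel while the interior remains the active touchscreen region, with each region routed to its own input mode. This is only an illustration of the abstract's wording, not the patent's implementation; the names (VirtualBezelScreen, TouchEvent, active_handler, bezel_handler) and the fixed bezel width are assumptions.

```python
# Minimal sketch (not from the patent): one way to model an active touchscreen region
# plus a virtual bezel area, each processing touches under its own mode of operation.
from dataclasses import dataclass
from typing import Callable


@dataclass
class TouchEvent:
    x: float  # horizontal position in pixels
    y: float  # vertical position in pixels


class VirtualBezelScreen:
    def __init__(self, width: int, height: int, bezel_px: int,
                 active_handler: Callable[[TouchEvent], None],
                 bezel_handler: Callable[[TouchEvent], None]) -> None:
        self.width, self.height, self.bezel_px = width, height, bezel_px
        self.active_handler = active_handler  # "first mode of operation"
        self.bezel_handler = bezel_handler    # "second mode of operation"

    def in_virtual_bezel(self, e: TouchEvent) -> bool:
        # The virtual bezel is a software-defined border strip of the touch surface.
        return (e.x < self.bezel_px or e.x > self.width - self.bezel_px or
                e.y < self.bezel_px or e.y > self.height - self.bezel_px)

    def dispatch(self, e: TouchEvent) -> None:
        # Route the touch to the handler for the region it landed in.
        (self.bezel_handler if self.in_virtual_bezel(e) else self.active_handler)(e)


if __name__ == "__main__":
    screen = VirtualBezelScreen(
        width=1080, height=1920, bezel_px=48,
        active_handler=lambda e: print(f"active region: tap at ({e.x}, {e.y})"),
        bezel_handler=lambda e: print(f"virtual bezel: edge input at ({e.x}, {e.y})"),
    )
    screen.dispatch(TouchEvent(540, 960))  # handled in the first mode
    screen.dispatch(TouchEvent(10, 960))   # handled in the second mode
```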




Disclaimer: Apex Standards Pseudo Claim Charting (PCC) is not intended to replace expert opinion; it provides due diligence and transparency prior to high-precision charting. PCC performs aggressive mapping (based on the broadest reasonable, ordinary, or customary interpretation, with multilingual translation) between a target patent's claim elements and other documents (potential technical standard specifications or prior art in the same or different jurisdictions). This enables a top-down, a priori evaluation with which stakeholders can quickly and effectively assess standard essentiality (potential strengths) or invalidity (potential weaknesses) before making complex, high-value decisions. PCC is designed to relieve the initial burden of proof by exhaustively listing contextual semantic mappings as potential building blocks toward a litigation-ready work product. Stakeholders may then modify shortlisted PCC results or identify other relevant materials in order to formulate strategy and pursue further purposes.
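The disclaimer above describes contextual semantic mapping between claim elements and reference text. The toy sketch below shows only the general shape of such a scoring step, using a simple token-overlap (Jaccard) similarity as a stand-in; the actual PCC similarity model is not disclosed here, and every name in the snippet (token_overlap, claim_elements, reference_passage) is hypothetical.

```python
# Toy stand-in for "contextual semantic mapping": score how well a prior-art passage
# overlaps a claim element. Jaccard token overlap is an assumption, not the PCC method.
def token_overlap(claim_element: str, passage: str) -> float:
    a, b = set(claim_element.lower().split()), set(passage.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0


claim_elements = [
    "active touchscreen region",
    "virtual bezel area",
    "second set of touch-based inputs",
]
reference_passage = ("a touch sensitive display with a border region that accepts "
                     "a second set of touch inputs distinct from the main display area")

for element in claim_elements:
    print(f"{element!r}: overlap score = {token_overlap(element, reference_passage):.2f}")
```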



Reference categories: Non-Patent Literature, WIPO Prior Art, EP Prior Art, US Prior Art, CN Prior Art, JP Prior Art, KR Prior Art

  Independent Claim

Ground | Reference | Owner of the Reference | Title | Semantic Mapping | Basis | Anticipation | Challenged Claims (1-18)
1

FIFTH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, PROCEEDINGS. : 423-428 2002

(Bretzner, 2002)
Owner: Kungliga Tekniska högskolan (KTH Royal Institute of Technology, Sweden). Title: Hand Gesture Recognition Using Multi-scale Colour Features, Hierarchical Models And Particle Filtering. Semantic mapping: electronic device, electronic device status display panel hand gestures

operating system status bar image feature

XXXXXXXXXX
2

CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, VOLS 1 AND 2. : 657-666 2007

(Vogel, 2007)
Owner: University of Toronto, Ontario, Canada. Title: Shift: A Technique For Operating Pen-Based Interfaces Using Touch. Semantic mapping: user input visual feedback

electronic device status display panel touch screen

XXXXXXX
3

UIST 2007: PROCEEDINGS OF THE 20TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY. : 269-278 2007

(Wigdor, 2007)
Owner: Mitsubishi Electric Research Laboratories. Title: LucidTouch: A See-Through Mobile Device. Semantic mapping: display system higher precision

s hand s hand

XXXXXXXXXXXXX
4

2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP). : 4205-4208 2012

(Rybach, 2012)
Owner: Rheinisch-Westfälische Technische Hochschule Aachen (RWTH Aachen University). Title: SILENCE IS GOLDEN: MODELING NON-SPEECH EVENTS IN WFST-BASED DYNAMIC NETWORK DECODERS. Semantic mapping: first mode speech recognition

s hand recognition system

XXXX
5

2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP). : 4125-4128 2012

(Sundaram, 2012)
Owner: Deutsche Telekom Labs. Title: LATENT PERCEPTUAL MAPPING WITH DATA-DRIVEN VARIABLE-LENGTH ACOUSTIC UNITS FOR TEMPLATE-BASED SPEECH RECOGNITION. Semantic mapping: first mode speech recognition

first portion temporal alignment

second mode new frame

XXXX
6

JOURNAL OF MICROMECHANICS AND MICROENGINEERING. 20 (7): - JUL 2010

(Takamatsu, 2010)
Owner: 東京大学, Tōkyō daigaku (The University of Tokyo). Title: Transparent Conductive-polymer Strain Sensors For Touch Input Sheets Of Flexible Displays. Semantic mapping: receiving touch flexible display

status bar visibility curved surface

touchscreen layer panel display

XXX
7

INTERSPEECH 2009: 10TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2009, VOLS 1-5. : 352-355 2009

(Harb, 2009)
Owner: Google Inc. Title: Back-Off Language Model Compression. Semantic mapping: third set compression algorithm

first mode speech recognition

status bar visibility two dimensions

XXXXX
8

PROCEEDINGS OF THE 11TH IASTED INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING. : 131-136 2007

(Guzzoni, 2007)
Owner: École polytechnique fédérale de Lausanne (EPFL). Title: Active, A Tool For Building Intelligent User Interfaces. Semantic mapping: first mode speech recognition

electronic device, electronic device status display panel hand gestures

XXXXXXXXXX
9

WO2013012667A1

(Wei Chen, 2013)
(Original Assignee) Apple Inc.     Touch sensitive displays electronic device electronic device

display system, electronic device status display panel control signals

holding pattern front surface

35 U.S.C. 103(a)

35 U.S.C. 102(b)
teaches the low impedance segment has zero ohms for purposes of circuit analysis when analyzing a circuit containing…

teaches using a black material to form the insulator as a way of preventing reflection of unwanted light off of…

discloses wherein the protecting layer spacer is provided as DLC or as silicon nitride thus exemplifying recognized…

teaches a similar semiconductor device wherein a flexible substrate is formed from metal or plastic…
XXXXXXXXXXXXXXXXXX
10

JP2012142033A

(P Hotelling Steve, 2012)
(Original Assignee) Apple Inc; アップル インコーポレイテッド     多機能ハンドヘルド装置 (Multi-functional hand-held device) receiving touch マルチタッチ (multi-touch)

first mode 動作モード (operation mode)

information items ハウジング (housing)

display screen 制御器 (controller)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…

discloses by generating responsive to the motion information a random…
XXXXXXX
11

US20120266079A1

(Mark Lee, 2012)
(Original Assignee) Splashtop Inc     

(Current Assignee)
Splashtop Inc
Usability of cross-device user interfaces holding pattern user interface element

electronic device status display panel determine location

screen mode application window

first set defined area

area comprising vertices user intent

35 U.S.C. 103(a)

35 U.S.C. 102(e)

35 U.S.C. 102(b)
teaches the invention substantially as claimed and described in claim…

describes a system that displays a control panel of a secondary devices connected through a network…

teaches a client computer system including a method of operation as in claim…

teaches the transformation of the images from the camera into viewable images for the portable display…
XXXXX
12

KR20120092036A

(서준규, 2012)
(Original Assignee) 삼성전자주식회사 (Samsung Electronics Co., Ltd.)     터치 스크린 디스플레이를 구비한 휴대 기기 및 그 제어 방법 (Mobile device having a touch screen display and control method therefor) electronic device 영역들 (regions)

thermal sensors 이미지 (image)

35 U.S.C. 103(a)

35 U.S.C. 102(a)
discloses a comparable touch input device which has been improved in the same way as the claimed invention…

discloses or at least renders the entirety of this further limitation…

teaches wherein the data interface comprises a USB data interface a…

teaches displaying at least one identifier identifying an application that is designated to be executed and displayed…
XXXXXXXXX
13

US20120081319A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Modifying the display stack upon device open second set display objects

first portion first portion

operating system status bar comprises one

screen mode one display

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXX
14

US20120084681A1

(Ron Cassar, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Application launch user input first touch screen

second portion, usage frequency second portion

first portion first portion

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXX
15

US20120084725A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Managing hierarchically related windows in a single display first set different application

operating system status bar play mode

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XX
16

WO2012044809A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     Repositioning windows in the pop-up window touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set, one screen

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
17

WO2012044743A2

(Alexander De Paz, 2012)
(Original Assignee) Imerj LLC     Gravity drop touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
18

WO2012044775A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     Keyboard filling one screen or spanning multiple screens of a multiple screen device first mode, second mode operating modes

second portion, usage frequency second portion

first portion first portion

receiving touch display area

first set, second set first one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXX
19

WO2012044805A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     Method and system for performing copy-paste operations on a device via user gestures first set different application

screen mode application window

second mode, second set following steps

receiving touch display area, user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXX
20

WO2012044801A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     Application display transitions between single and multiple displays receiving touch display area

second set, display system second set

first set first set

third set third set

holding pattern same time

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXX
21

WO2012044780A1

(Rodney Wayne Schrock, 2012)
(Original Assignee) Imerj LLC     Single- screen view in response to rotation response instruction fourth instructions

touchscreen layer, touchscreen display computing system

first mode first direction

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
22

WO2012044839A2

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     Smartpad orientation electronic device status display panel touch screen

touchscreen display full screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXX
23

WO2012044739A2

(Alexander De Paz, 2012)
(Original Assignee) Imerj LLC     Rotation gravity drop touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
24

US20120084676A1

(Alexander de Paz, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Dual screen application visual indicator touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

user input user input

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
25

US20120084698A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen with keyboard first set different application

electronic device status display panel touch screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXX
26

US20120084701A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Keyboard maximization first portion closed state

receiving touch display area

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XX
27

US20120081313A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen desktop first set different application

electronic device status display panel touch screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXX
28

US20120084675A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Annunciator drawer touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

user input user input

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
29

US20120084674A1

(John Steven Visosky, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Allowing multiple orientations in dual screen view touchscreen layer, touchscreen display computing system, main display

display screen display state

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXXX
30

US20120081317A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Method and system for performing copy-paste operations on a device via user gestures first set different application

screen mode application window

second mode, second set following steps

receiving touch display area, user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXX
31

US20120081316A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Off-screen gesture dismissable keyboard second portion, usage frequency second portion

display screen, screen mode display output

touchscreen area second areas

user input user input

receiving touch first area

first set first set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXX
32

US20120081399A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Visible card stack first set different application

electronic device status display panel touch screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXX
33

US20120081398A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen first set different application

second portion, usage frequency second portion

first portion first portion, n storage

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXX
34

US20120081315A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Keyboard spanning multiple screens first portion first portion

receiving touch display area

first set, second set first one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XX
35

US20120081854A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen desktop first set different application

electronic device status display panel touch screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXX
36

US20120081403A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen first set different application

second portion, usage frequency second portion

first portion first portion, n storage

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXX
37

US20120081400A1

(Rodney Wayne Schrock, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Dual-screen view in response to rotation touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
38

US20120081314A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen desktop first set different application

electronic device status display panel touch screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXX
39

US20120084710A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Repositioning windows in the pop-up window touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set, one screen

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
40

US20120084680A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Gesture capture for manipulation of presentations on one or more device displays information items second determining

second mode, second set following steps

receiving touch user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXX
41

US20120084694A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Method and system for performing drag and drop operations on a device via user gestures first set different application

screen mode application window

second mode, second set following steps

receiving touch display area, user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXX
42

US20120084700A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Keyboard dismissed on closure of device first portion closed state

receiving touch display area

user input user input

first set, second set first one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXX
43

US20120084721A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Window stack modification in response to orientation change touchscreen layer, touchscreen display computing system

receiving touch display area

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXX
44

US20120081270A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Dual screen application behaviour touchscreen layer, touchscreen display computing system

receiving touch display area

screen mode screen mode

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXX
45

US20120081271A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Application display transitions between single and multiple displays receiving touch display area

second set, display system second set

first set first set

third set third set

holding pattern same time

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXX
46

US20120081280A1

(Rodney Wayne Schrock, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Single-screen view in response to rotation response instruction fourth instructions

touchscreen layer, touchscreen display computing system

first mode first direction

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXXXXXXXXXX
47

US20120081289A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Keyboard filling one screen or spanning multiple screens of a multiple screen device first mode, second mode operating modes

second portion, usage frequency second portion

first portion first portion

receiving touch display area

first set, second set first one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXX
48

US20120081292A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Desktop reveal first set different application

second portion, usage frequency second portion

first portion first portion, n storage

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXX
49

US20120081293A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Gravity drop rules and keyboard display on a multiple screen device user input first touch screen, user input

second mode, screen mode user selection

second portion, usage frequency second portion

first portion first portion

display screen touch screens

receiving touch display area

first set, second set first one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXXXX
50

US20120081311A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad orientation electronic device status display panel touch screen

touchscreen display full screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXX
51

US20120081312A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Smartpad split screen electronic device status display panel touch screen

touchscreen display full screen

first portion n storage

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the second display panel comprises a gyroscope sensor examples of hardware sensors include gyroscopes…

teaches a screen change method for a device having a plurality of touch screens the screen change method comprising…

teaches wherein the first tap gesture is performed while the short press gesture is still being held…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…
XXXXXXX
52

WO2012044545A2

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj, Llc     Gesture controlled screen repositioning for one or more displays user input first touch screen

first mode first direction

second portion, usage frequency second portion

operating system status bar single display

first portion first portion

screen mode screen mode

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXX
53

WO2012044516A2

(Alex De Paz, 2012)
(Original Assignee) Imerj, Llc     Multi-screen user interface with orientation based control first mode first direction

first set different one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXX
54

WO2012044515A2

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj, Llc     Gesture based application management second portion, usage frequency second portion

first portion first portion

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXX
55

WO2012044510A2

(Paul E. Reeves, 2012)
(Original Assignee) Imerj, Llc     User interface with independent drawer control display system first location

display screen, screen mode status bar

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXXX
56

US20120081309A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Displayed image transition indicator receiving touch following operation

second mode, second set following steps

user input user input

usage frequency red color

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXX
57

US20120081310A1

(Rodney W. Schrock, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Pinch gesture to swap windows receiving touch following operation

screen mode touch display

holding pattern same time

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XX
58

US20120084714A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Window stack models for multi-screen displays operating system status bar comprises one

receiving touch display area

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
X
59

US20120081397A1

(Alexander de Paz, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Rotation gravity drop touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXXX
60

US20120081267A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Desktop reveal expansion touchscreen layer, touchscreen display computing system

first portion closed state

receiving touch display area

second set, display system second set

user input user input

first set first set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXX
61

US20120081268A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Launching applications into revealed desktop touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

user input user input

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXXX
62

US20120081269A1

(Alex de Paz, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Gravity drop touchscreen layer, touchscreen display computing system

receiving touch display area

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXXX
63

US20120081302A1

(Martin Gimpl, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Multi-screen display control touchscreen layer, touchscreen display computing system

second set, display system second set

first set first set

third set third set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXXX
64

US20120081304A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Hardware buttons activated based on focus receiving touch to display information, display area

second portion second output

electronic device status display panel touch screen

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXX
65

US20120081305A1

(Rodney W. Schrock, 2012)
(Original Assignee) Imerj LLC     

(Current Assignee)
Z124
Swipeable key line first mode first direction

second portion, usage frequency second portion

first portion first portion

display screen touch screens

first set first set

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXX
66

US20130021289A1

(Wei Chen, 2013)
(Original Assignee) Apple Inc     

(Current Assignee)
Apple Inc
Touch sensitive displays electronic device electronic device

display system, electronic device status display panel control signals

holding pattern front surface

35 U.S.C. 103(a)

35 U.S.C. 102(b)
teaches the low impedance segment has zero ohms for purposes of circuit analysis when analyzing a circuit containing…

teaches using a black material to form the insulator as a way of preventing reflection of unwanted light off of…

discloses wherein the protecting layer spacer is provided as DLC or as silicon nitride thus exemplifying recognized…

teaches a similar semiconductor device wherein a flexible substrate is formed from metal or plastic…
XXXXXXXXXXXXXXXXXX
67

CN102713822A

(池田洋一 (Yoichi Ikeda), 2012)
(Original Assignee) Panasonic Corp     

(Current Assignee)
Panasonic Corp
信息输入装置、信息输入方法以及程序 (Information input device, information input method, and program) information items 的位置 (the position of)

touchscreen area 第一触 (first touch)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a flexible display that displays content from applications that can be manipulated in response to a touch…

discloses by generating responsive to the motion information a random…
XX
68

US20120293456A1

(Yoichi Ikeda, 2012)
(Original Assignee) Panasonic Corp     

(Current Assignee)
Panasonic Intellectual Property Corp of America
Information input apparatus, information input method, and program s hand threshold value

information items operation type

operating system status bar operation time

first set, second set first contact

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a flexible display that displays content from applications that can be manipulated in response to a touch…

discloses by generating responsive to the motion information a random…
XXXXX
69

US20120154294A1

(Kenneth P. Hinckley, 2012)
(Original Assignee) Microsoft Corp     

(Current Assignee)
Microsoft Technology Licensing LLC
Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device operating system status bar comprises one

thermal sensors, receiving touch light sensor

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

teaches the sensing unit senses a bending degree of the display unit and wherein the controller changes a moving speed…

discloses by generating responsive to the motion information a random…
XX
70

US20120084697A1

(Paul E. Reeves, 2012)
(Original Assignee) Flextronics Innovative Development LLC     

(Current Assignee)
Z124
User interface with independent drawer control display system first location

display screen, screen mode status bar

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXXXXXXXXXXXX
71

US20120084735A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Flextronics Innovative Development LLC     

(Current Assignee)
Z124
Gesture controls for multi-screen user interface touchscreen layer, touchscreen area external display

operating system status bar single display

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXX
72

US20120084738A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Flextronics Innovative Development LLC     

(Current Assignee)
Z124
User interface with stacked application management second portion, usage frequency second portion

first portion first portion

operating system status bar comprises one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXX
73

US20120084736A1

(Sanjiv Sirpal, 2012)
(Original Assignee) Flextronics Innovative Development LLC     

(Current Assignee)
Z124
Gesture controlled screen repositioning for one or more displays user input first touch screen

first mode first direction

second portion, usage frequency second portion

operating system status bar single display

first portion first portion

screen mode screen mode

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXXXXXX
74

US20120081277A1

(Alex de Paz, 2012)
(Original Assignee) Flextronics Innovative Development LLC     

(Current Assignee)
Z124
Multi-screen user interface with orientation based control first mode first direction

first set different one

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches wherein the data interface comprises a USB data interface a…

teaches that objects displayed on the dockable display device can be arranged on the display panel in a variety of…

teaches parent and child screens on a first touch screen fig…

discloses a mobile data path selection wherein the configuring of functionality optimizes available bandwidth see…
XXX
75

CN101996043A

(金龙植, 2011)
(Original Assignee) FANTAI Co Ltd     

(Current Assignee)
Pan Thai Co.,Ltd.
执行移动终端的热键功能的装置和方法 (Apparatus and method for executing a hot-key function of a mobile terminal) electronic device, handheld interactive electronic device 功能的方法

virtual bezel region function 选择功能 (selection function)

heat signature 第三热 (third heat)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
discloses that the pen commands include a save command to save items on the display abstract…

teaches handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or…

teaches wherein the determining of whether the touch input matched to the touch input event is valid comprises…

discloses positioning system indicate a location change of the asset and transmit alert messages…
XXXXXXXXX
76

CN102667662A

(罗尔·弗特加尔, 2012)
(Original Assignee) 罗尔·弗特加尔; 贾斯廷·利; 伊夫斯·比哈尔; 皮查亚·帕通古尔     柔性显示器的交互技术 (Interaction techniques for flexible displays) electronic device 包含下列步骤 (comprising the following steps)

second mode 数字上

usage frequency 的使用

receiving touch 的接收

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses that a folding of a flexible display may cause objects displayed in the document to be moved to the center of…

discloses a device having a touch screen in which inputs are made with a bend sensor and a position sensor…

teaches displaying the different perspective of the object comprises displaying the object as a shape that corresponds…

teaches it the display unit is convexly bent along with the screen the controller moves an object to an edge of the…
XXXXXXXXX
77

KR20120093148A

(로엘 버티갈, 2012)
(Original Assignee) 로엘 버티갈; 이브 베하; 저스틴 이; 피차야 푸톤굴     플렉시블 디스플레이를 위한 상호작용 기법 (Interaction techniques for a flexible display) s thermal sensors 다이오드 (diode)

display system 베이스 (base)

electronic device status display panel 와이핑 (wiping)

receiving touch 스크린 (screen)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses that a folding of a flexible display may cause objects displayed in the document to be moved to the center of…

discloses a device having a touch screen in which inputs are made with a bend sensor and a position sensor…

teaches displaying the different perspective of the object comprises displaying the object as a shape that corresponds…

teaches it the display unit is convexly bent along with the screen the controller moves an object to an edge of the…
XXXXXXXXXXXXXX
78

US20110261058A1

(Tong Luo, 2011)
(Original Assignee) Tong Luo     

(Current Assignee)
HANDSCAPE Inc A DELAWARE Corp
Method for user input from the back panel of a handheld computerized device area comprising vertices imaginary plane

screen mode one display

s hand s hand

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
teaches controlling activationdeactivation of touch sensitive screen ie second input area as well as other component…

teaches wherein the controller recognizes a particular user motion according to whether or not the variation of the…

discloses manually setting the speed and direction of animations see paragraph…

teaches a motion sensing unit configured to sense a user motion…
XX
79

US20100141605A1

(Tae Young Kang, 2010)
(Original Assignee) Samsung Electronics Co Ltd     

(Current Assignee)
Samsung Electronics Co Ltd
Flexible display device and data displaying method thereof receiving touch flexible display, first area

first mode first direction

first portion second corner, first corner

35 U.S.C. 103(a)

35 U.S.C. 102(a)

35 U.S.C. 102(b)
discloses touching a displayed object such as the content shown in FIGS…

discloses a flexible display that displays content from applications that can be manipulated in response to a touch…

teaches the sensing unit senses a bending degree of the display unit and wherein the controller changes a moving speed…

teaches wherein the predetermined function is defined according to a user selection icons functions can be displayed…
XXX
80

US20100066677A1

(Peter Garrett, 2010)
(Original Assignee) Legacy IP LLC     

(Current Assignee)
Edge Mobile Payments LLC
Computer Peripheral Device Used for Communication and as a Pointing Device touchscreen layer, touchscreen display computing system

touchscreen area digital camera

operating system status bar, status bar visibility host computer

35 U.S.C. 103(a)

35 U.S.C. 102(e)

35 U.S.C. 102(b)
teaches a smart phone which inherently has a touch screen to control operation based on touch such as dragging bars…

teaches that the lens assembly can be a sliding assembly p…

teaches a mobile phone and camera thereof title and further teaches the lenses of the wheel are positioned such that…

teaches that mobile units can be identified by either an IP address or a media access address paragraph…
XXXXXX
81

CN101729658A

(金钟焕, 2010)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
使用投影仪模块的节电移动终端及其方法 (Power-saving mobile terminal using a projector module and method thereof) response instruction, s response instruction 响应于检测 (in response to detecting)

receiving touch 触摸屏处 (at the touch screen), 包括接收 (comprising receiving)

35 U.S.C. 103(a)

35 U.S.C. 102(a)
teaches wherein the determining of whether the touch input matched to the touch input event is valid comprises…

teaches a projection system that detects a display size screen size and transmits light within the detected size of…

discloses motion trigger by push button on handheld device to wake up handheld device and in response to detecting a…

discloses wherein a function is determined according to the motion direction of the mobile terminal if the hot key…
X
82

EP2178274A1

(Jong Hwan Kim, 2010)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
Power saving mobile terminal using projector module and method for same operating system status bar comprises one

user input user input

electronic device status display panel, information items key input

35 U.S.C. 103(a)

35 U.S.C. 102(a)
teaches wherein the determining of whether the touch input matched to the touch input event is valid comprises…

teaches a projection system that detects a display size screen size and transmits light within the detected size of…

discloses motion trigger by push button on handheld device to wake up handheld device and in response to detecting a…

discloses wherein a function is determined according to the motion direction of the mobile terminal if the hot key…
XXXXXXXX
83

US20100099457A1

(Jong Hwan Kim, 2010)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
Mobile communication terminal and power saving method thereof operating system status bar comprises one

user input user input

electronic device status display panel, information items key input

35 U.S.C. 103(a)

35 U.S.C. 102(a)
teaches wherein the determining of whether the touch input matched to the touch input event is valid comprises…

teaches a projection system that detects a display size screen size and transmits light within the detected size of…

discloses motion trigger by push button on handheld device to wake up handheld device and in response to detecting a…

discloses wherein a function is determined according to the motion direction of the mobile terminal if the hot key…
XXXXXXXX
84

US20100045705A1

(Roel Vertegaal, 2010)
(Original Assignee) Roel Vertegaal; Justin Lee; Behar Yves; Pichaya Puttorngul     Interaction techniques for flexible displays second mode, second set following steps, said input

information items placing one

add one RFID tags

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses that a folding of a flexible display may cause objects displayed in the document to be moved to the center of…

discloses a device having a touch screen in which inputs are made with a bend sensor and a position sensor…

teaches displaying the different perspective of the object comprises displaying the object as a shape that corresponds…

teaches it the display unit is convexly bent along with the screen the controller moves an object to an edge of the…
XXXXX
85

US20100007632A1

(Shunpei Yamazaki, 2010)
(Original Assignee) Semiconductor Energy Laboratory Co Ltd     

(Current Assignee)
Semiconductor Energy Laboratory Co Ltd
Semiconductor device holding pattern liquid crystal element

first portion second pixel

usage frequency red color

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
discloses more commonly used optical sensing technique is based on detecting and analyzing variations in light…

discloses paper currency recognition method according to claim…

teaches wherein the pixel electrode is located between the drive electrode and the detection electrode…

discloses a mobile device wherein a soft keyboard for text input being divided into alphanumeric input section as shown…
XXXX
86

EP2161645A2

(Jong Hwan Kim, 2010)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
Portable terminal and driving method of the same display screen thumbnail image

operating system status bar comprises one

electronic device status display panel touch screen

second mode said input

information items input data

35 U.S.C. 103(a)

35 U.S.C. 102(e)

35 U.S.C. 102(b)
discloses wherein a function is determined according to the motion direction of the mobile terminal if the hot key…

teaches handheld electronic device includes cross functional physical buttons fig…

describes about transmitting the encoded input gesture message to a recipient mobile device…

teaches a method of processing a user input by using a touch screen the method comprising sensing whether a touch…
XXXXXXXX
87

CN101655769A

(金钟焕, 2010)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
便携式终端及其驱动方法 (Portable terminal and driving method thereof) usage frequency 包括使用 (comprising using)

virtual bezel, virtual bezel region 且显示 (and displaying)

35 U.S.C. 103(a)

35 U.S.C. 102(e)

35 U.S.C. 102(b)
discloses wherein a function is determined according to the motion direction of the mobile terminal if the hot key…

teaches handheld electronic device includes cross functional physical buttons fig…

describes about transmitting the encoded input gesture message to a recipient mobile device…

teaches a method of processing a user input by using a touch screen the method comprising sensing whether a touch…
XXXXXXXXXXXXXXX
88

US20090259969A1

(Matt Pallakoff, 2009)
(Original Assignee) Matt Pallakoff     Multimedia client interface devices and methods touchscreen display contact point

electronic device, user input other port, user input

first set first type

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
teaches controlling activationdeactivation of touch sensitive screen ie second input area as well as other component…

teaches wherein the controller recognizes a particular user motion according to whether or not the variation of the…

teaches a motion sensing unit configured to sense a user motion…

discloses the speed of a swipe could be used to determine how a device responds see paragraph…
XXXXXXXXX
89

KR20100065418A

(강태영, 2010)
(Original Assignee) 삼성전자주식회사     가요성 표시부를 가지는 단말기 및 그의 데이터 표시 방법 (Terminal having a flexible display unit and data display method thereof) electronic device 영역들 (regions)

display system 중앙을 (the center)

35 U.S.C. 103(a)

35 U.S.C. 102(a)

35 U.S.C. 102(b)
teaches the sensing unit senses a bending degree of the display unit and wherein the controller changes a moving speed…

discloses touching a displayed object such as the content shown in FIGS…

discloses a flexible display that displays content from applications that can be manipulated in response to a touch…

teaches wherein the predetermined function is defined according to a user selection icons functions can be displayed…
XXXXXXXXXXXXXXXXXX
90

US20100030549A1

(Michael M. Lee, 2010)
(Original Assignee) Apple Inc     

(Current Assignee)
Apple Inc
Mobile device having human language translation capability with positional feedback electronic device electronic device

first mode ninety degrees

second mode, screen mode second modes

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(a)

35 U.S.C. 102(b)
teaches the sensing unit senses a bending degree of the display unit and wherein the controller changes a moving speed…

discloses the plurality of menu items iconsmenus are determined based on at least a frequency at which the user accesses…

teaches wherein the determining of whether the touch input matched to the touch input event is valid comprises…

teaches where the predefined gesture is user determined see…
XXXXXXXXXX
91

CN101431563A

(吴汉奎, 2009)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
移动终端 (Mobile terminal) second portion 的外侧 (outer side of)

add one 包括一 (comprising one)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a base touch input device upon which the claimed invention is an improvement…

teaches wherein the plurality of force sensors detect finger movement through a protective element…

teaches a touchcontrolled electronic apparatus comprising a touch screen…

discloses a touch screen that is configurable to selectively display various inputtable signal sets or other visual…
XXX
92

US20090122026A1

(Han-Gyu OH, 2009)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
Mobile terminal receiving touch to display information, user inputs

user input, user input area opposite surface

display screen non-edge portion

electronic device status display panel touch screen

holding pattern first region

touchscreen layer inner side

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a base touch input device upon which the claimed invention is an improvement…

teaches wherein the plurality of force sensors detect finger movement through a protective element…

teaches a touchcontrolled electronic apparatus comprising a touch screen…

discloses a touch screen that is configurable to selectively display various inputtable signal sets or other visual…
XXXXXXXXXX
93

US20090254855A1

(Martin Kretz, 2009)
(Original Assignee) Sony Mobile Communications AB     

(Current Assignee)
Sony Mobile Communications AB
Communication terminals with superimposed user interface touchscreen area user input device

s hand s hand

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses generating a random based on the measurements received by a motion sensor…

teaches wherein the plurality of force sensors detect finger movement through a protective element…

teaches the method comprising zooming out or zooming in depending on the change of distance of the hand from the…

discloses a comparable touch padpanel device that has been improved in the same way as the claimed invention…
XX
94

WO2009071336A2

(Vesa Luiro, 2009)
(Original Assignee) Nokia Corporation     Method for using accelerometer detected imagined key press user input, user input area receiving user input

operating system status bar comprises one

heat signature time frame

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
teaches the method comprising zooming out or zooming in depending on the change of distance of the hand from the…

teaches wherein the controller recognizes a particular user motion according to whether or not the variation of the…

teaches controlling activationdeactivation of touch sensitive screen ie second input area as well as other component…

discloses manually setting the speed and direction of animations see paragraph…
XXXXXX
95

EP2058729A1

(Han-Gyu Oh, 2009)
(Original Assignee) LG Electronics Inc     

(Current Assignee)
LG Electronics Inc
Mobile terminal holding pattern first region

touchscreen layer inner side

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a base touch input device upon which the claimed invention is an improvement…

teaches wherein the plurality of force sensors detect finger movement through a protective element…

teaches a touchcontrolled electronic apparatus comprising a touch screen…

discloses a touch screen that is configurable to selectively display various inputtable signal sets or other visual…
XXXX
96

US20080088602A1

(Steven Hotelling, 2008)
(Original Assignee) Apple Inc     

(Current Assignee)
Apple Inc
Multi-functional hand-held device first set current touch

second set including one

screen mode screen mode

touchscreen display full screen

receiving touch user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…

discloses by generating responsive to the motion information a random…
XXXX
97

WO2008005505A2

(Steve Porter Hotelling, 2008)
(Original Assignee) Apple Inc.     Capacitance sensing electrode with integrated I/O device screen mode, s response instruction second communication lines

first portion first communication

heat signature same communication

electronic device electronic device

first mode, first set angular position

area comprising vertices output mechanism

electronic device status display panel touch screen

display screen, add one touch pad

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses by generating responsive to the motion information a random…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…
XXXXXXXXXXXXX
98

CN101432677A

(N·金, 2009)
(Original Assignee) Apple Computer Inc     

(Current Assignee)
Apple Inc
具有显示器和用于用户界面及控制的周围触摸敏感边框的电子设备 (Electronic device having a display and a surrounding touch-sensitive bezel for user interface and control) first set 的多个电 (plurality of elec-, fragment)

information items 的位置 (position of)

virtual bezel display screen 具有周 (having a surr-, fragment)

add one 相比较 (compared with), 包括一 (comprising one)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…

discloses by generating responsive to the motion information a random…
XXXXX
99

US20080007533A1

(Steve P. Hotelling, 2008)
(Original Assignee) Apple Computer Inc     

(Current Assignee)
Apple Inc
Capacitance sensing electrode with integrated I/O mechanism screen mode, s response instruction second communication lines

first portion first communication

heat signature same communication

electronic device electronic device

first mode, first set angular position

area comprising vertices output mechanism

electronic device status display panel touch screen

display screen, add one touch pad

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses by generating responsive to the motion information a random…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…
XXXXXXXXXXXXX
100

US20060238517A1

(Nick King, 2006)
(Original Assignee) Apple Computer Inc     

(Current Assignee)
Apple Inc
Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control holding pattern surface position

touchscreen layer designating one

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

teaches the contact body contains a conductive rubber member see…

discloses by generating responsive to the motion information a random…
XXXX
101

US20070291008A1

(Daniel Wigdor, 2007)
(Original Assignee) Mitsubishi Electric Research Laboratories Inc     

(Current Assignee)
Mitsubishi Electric Research Laboratories Inc
Inverted direct touch sensitive input devices status bar visibility display images, display unit

first set current touch

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

teaches the sensing unit senses a bending degree of the display unit and wherein the controller changes a moving speed…

discloses by generating responsive to the motion information a random…
XX
102

WO2006094308A2

(Steve P. Hotelling, 2006)
(Original Assignee) Apple Computer, Inc.     Multi-functional hand-held device first set current touch

second set including one

electronic device status display panel touch screen

touchscreen display full screen

receiving touch user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…

discloses by generating responsive to the motion information a random…
XXXXXX
103

KR20070110114A

(스티브 피. 호텔링, 2007)
(Original Assignee) 애플 인크.     다기능 휴대용 장치 (Multi-functional portable device) sensitive display, sensitive display screen physical buttons, 컨트롤들 (controls)

s response instruction, response instruction 입력들을 (inputs), indication

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…

discloses by generating responsive to the motion information a random…
X
104

US20060197753A1

(Steven Hotelling, 2006)
(Original Assignee) Apple Computer Inc     

(Current Assignee)
Apple Inc
Multi-functional hand-held device first set current touch

second set including one

electronic device status display panel touch screen

touchscreen display full screen

receiving touch user inputs

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…

discloses by generating responsive to the motion information a random…
XXXXXX
105

US8018440B2

(Reed L. Townsend, 2011)
(Original Assignee) Microsoft Corp     

(Current Assignee)
Microsoft Technology Licensing LLC
Unintentional touch rejection touchscreen layer, touchscreen display first character

usage frequency low probability

receiving touch first area

holding pattern same time

s hand s hand

XXXXX
106

WO2005008444A2

(Matt Pallakoff, 2005)
(Original Assignee) Matt Pallakoff     System and method for a portable multimedia client electronic device, electronic device status display panel electronic device, sensor means

user input area service provider

first mode first direction

first set second function

operating system status bar display control

second set remote device

s hand s hand

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
teaches controlling activationdeactivation of touch sensitive screen ie second input area as well as other component…

teaches wherein the controller recognizes a particular user motion according to whether or not the variation of the…

teaches a motion sensing unit configured to sense a user motion…

discloses the speed of a swipe could be used to determine how a device responds see paragraph…
XXXXXXXXXXX
107

US20050012723A1

(Matt Pallakoff, 2005)
(Original Assignee) MOVE MOBILE SYSTEMS Inc     

(Current Assignee)
MOVE MOBILE SYSTEMS Inc
System and method for a portable multimedia client electronic device, electronic device status display panel electronic device, sensor means

user input area service provider

first mode first direction

first set second function

operating system status bar display control

second set remote device

s hand s hand

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(a)
teaches controlling activationdeactivation of touch sensitive screen ie second input area as well as other component…

teaches wherein the controller recognizes a particular user motion according to whether or not the variation of the…

teaches a motion sensing unit configured to sense a user motion…

discloses the speed of a swipe could be used to determine how a device responds see paragraph…
XXXXXXXXXXX
108

US20030221876A1

(Paul Doczy, 2003)
(Original Assignee) Hewlett Packard Development Co LP     

(Current Assignee)
Hewlett Packard Development Co LP
Instrument-activated sub-surface computer buttons and system and method incorporating same electronic device, first mode triggering signal

display screen display screen

first set different one

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a system that configures a portable device at startup based on the detected expansion unit ie docking stations…

teaches that upon the device docking with the smartpad an orientation of the smartpad is detected see paragraph…

discloses a method wherein detecting the environmental information includes detecting whether specified hardware is…

teaches this concept to provide the advantage of having the images displayed on the tablet be correctly aligned with…
XXXXXXXXXXXX
109

US5606346A

(Tsutomu Kai, 1997)
(Original Assignee) Panasonic Corp     

(Current Assignee)
Panasonic Corp
Coordinate input device first mode oscillating frequency

screen mode control means

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses by generating responsive to the motion information a random…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…
XXX
110

US20120266072A1

(Jeyhan Karaoguz, 2012)
(Original Assignee) Broadcom Corp     

(Current Assignee)
Avago Technologies International Sales Pte Ltd
Method And System For A Digital Diary System operating system status bar wireless communication module

user input, user input area receiving user input, capture module

heat signature respective user

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(e)
teaches a wireless communication device wherein the power control circuitry enables the information stored in the…

discloses an apparatus comprising a mobile computing device to support cellular voice communication wireless data…

teaches creating an electronic journal from multiple sources of data…

teaches a device including a PDA and a cell phone which can be set so that the cell phone is turned off in certain…
XXXXXXXX
111

EP2530677A2

(Sung-Jae Hwang, 2012)
(Original Assignee) Samsung Electronics Co Ltd     

(Current Assignee)
Samsung Electronics Co Ltd
Method and apparatus for controlling a display of multimedia content using a timeline-based interface operating system status bar display control

electronic device status display panel touch screen

first set time line

35 U.S.C. 103(a)

35 U.S.C. 102(b)
teaches wherein the apparatus is configured to in response to receiving the input by the first object in the multi…

discloses gestures for controlling manipulating and editing of media files using touch sensitive devices…

discloses a proximity sensitive device that detects gesture inputs to control edit and manipulate media files…

discloses all that is explained above there fails to be specific disclosure of a double click is determined by…
XXXX
112

US20120229403A1

(Nicolas De Jong, 2012)
(Original Assignee) Koninklijke Philips NV     

(Current Assignee)
Koninklijke Philips NV
Image display that moves physical objects and causes tactile sensation touchscreen area determined direction

screen mode display image

X
113

US20120194461A1

(Seung E. Lim, 2012)
(Original Assignee) Lester F. Ludwig     

(Current Assignee)
NRI R&d Patent Licensing LLC
Advanced touch control of interactive map viewing via finger angle using a high dimensional touchpad (hdtp) touch user interface touchscreen layer, touchscreen area right position

operating system status bar measured data

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches where a gesture of a right hand edge sweep going from the right to the left of the touch region…

teaches a touch device where if the touch area size corresponds to a hand edge touch area size and the touch…

discloses converting inputted analog waveforms into respective digital representations using a binary search…

discloses that a multitouch surface apparatus for sensing diverse configurations and activities of fingers and palms of…
XXXX
114

US20120194462A1

(Seung E. Lim, 2012)
(Original Assignee) Lester F. Ludwig     

(Current Assignee)
NRI R&d Patent Licensing LLC
Advanced touch control of interactive immersive imaging applications via finger angle using a high dimensional touchpad (hdtp) touch user interface touchscreen layer, touchscreen area right position

operating system status bar measured data

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches where a gesture of a right hand edge sweep going from the right to the left of the touch region…

teaches a touch device where if the touch area size corresponds to a hand edge touch area size and the touch…

discloses converting inputted analog waveforms into respective digital representations using a binary search…

discloses that a multitouch surface apparatus for sensing diverse configurations and activities of fingers and palms of…
XXXX
115

US20130002610A1

(Hsien-Lung Ho, 2013)
(Original Assignee) Hon Hai Precision Industry Co Ltd     

(Current Assignee)
Hon Hai Precision Industry Co Ltd
Touch sensitive display device second portion, usage frequency second portion

first portion first portion

screen mode touch display

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses generating a random based on the measurements received by a motion sensor…

teaches the projected capacitive touch sensing system of claim…

discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses a portable handheld device capable of wirelessly communicating with a remote processor via a network the…
XXXX
116

US20120223900A1

(Motoya JIYAMA, 2012)
(Original Assignee) Alps Electric Co Ltd; Denso Ten Ltd     

(Current Assignee)
Denso Ten Ltd ; Alps Alpine Co Ltd
Display device display screen, screen mode touch panel

electronic device on state

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches a base touch padpanel device upon which the claimed invention is an improvement…

teaches wherein upon a determination of the hovering finger condition the processing circuitry is further operable to…

teaches touch sensitive element values are selected from touch sensors measured capacitance…

discloses the pointer detection apparatus according to claim…
XXXXXXXXXXX
117

US20120105370A1

(Chad B. Moore, 2012)
(Original Assignee) Nupix LLC     

(Current Assignee)
Nupix LLC
Electroded Sheet for a Multitude of Products s hand wireless communication link

touchscreen layer antiglare surface, panel display

receiving touch flexible display

electronic device status display panel touch screen

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses wherein each of the electrodes of said input array of electrodes and of said output array of electrodes…

teaches wherein the pixel electrode is located between the drive electrode and the detection electrode…

discloses a metal conductor overlaying a transparent indium tin oxide conductor…

discloses of a touch panel equipped display device according to claim…
XXXXX
118

US20130063364A1

(Stephen C. Moore, 2013)
(Original Assignee) Motorola Mobility LLC     

(Current Assignee)
Google Technology Holdings LLC
Using pressure differences with a touch-sensitive display screen electronic device personal communications

second set predefined criterion

thermal sensors, s thermal sensors square root

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses a system comprising a power management system and wherein control logic generates at least one command that is…

discloses creating a playlist wherein a command to play the playlist causes the list of songs to be played by a…

discloses this limitation in that in response to the detection of a rightward finger gesture imparted within the display…
XXXXXXXXX
119

US20130063389A1

(Stephen C. Moore, 2013)
(Original Assignee) Motorola Mobility LLC     

(Current Assignee)
Google Technology Holdings LLC
Using pressure differences with a touch-sensitive display screen electronic device personal communications

second set predefined criterion

thermal sensors, s thermal sensors square root

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches the use of touchscreen devices on various computing devices…

discloses maintaining the expansion of the expanded area in a last sensed location for a predetermined amount of time…

discloses wherein recognizing the gesture comprises determining a first position of the presence of the first conductive…

teaches for example that gestures can be created to detect and effect a user command to resize a window scroll a…
XXXXXXXXX
120

WO2012021417A1

(Anthony T. Blow, 2012)
(Original Assignee) Qualcomm Incorporated     Method and system for adjusting display content thermal sensors sensed location, more sensor

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)
discloses that it is known in the art to provide a glowing appearance to an icon when an icon is being interacted with…

teaches the information processing apparatus wherein the processor determines based on an output from the user…

teaches a method of single hand user interface it would have been obvious to one of ordinary skill in the art at the…

teaches the gesture comprises a rolling motion of the thumb…
XXXXX
121

JP2013030050A

(Tomoaki Matsuki, 2013)
(Original Assignee) Kddi Corp; Kddi株式会社     スクリーンパッドによる入力が可能なユーザインタフェース装置、入力処理方法及びプログラム (User interface device enabling input via a screen pad, input processing method, and program) first portion モニタ (monitor)

receiving touch パネル (panel)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

teaches first and second tactile sensations for button press and release events wherein the second tactile sensation…

discloses an electronic input device analogous in art with that of…
XX
122

US20120044172A1

(Yoshihito Ohki, 2012)
(Original Assignee) Sony Corp     

(Current Assignee)
Sony Corp
Information processing apparatus, program, and operation control method operating system status bar operation control

s hand threshold value

status bar visibility display unit

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches contact lists and also teaches storing photo images and using icon it would have been obvious to one of…

discloses the information communication terminal according to claim…

discloses detection of gestures with touch sensitive devices for manipulating displayed data see abstract wherein the…

discloses having a facial recognition software that will help the user to sort and collect images from the stored…
XX
123

US20120023450A1

(Ryuichiro Noto, 2012)
(Original Assignee) Sony Corp     

(Current Assignee)
Sony Corp
User interface device and user interface method screen mode, display screen touch panel, display screen

receiving touch display area

XXXXXX
124

US20120306765A1

(Stephen C. Moore, 2012)
(Original Assignee) Motorola Mobility LLC     

(Current Assignee)
Google Technology Holdings LLC
Using pressure differences with a touch-sensitive display screen electronic device personal communications

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a base touch input panel upon which the claimed invention is an improvement…

discloses an apparatus capable of performing the method of claim…

discloses wherein selfcapacitance sensors also suffer from stray or parasitic capacitance and would require a means to…

discloses converting inputted analog waveforms into respective digital representations using a binary search…
XXXXXXXXX
125

US20120306766A1

(Stephen C. Moore, 2012)
(Original Assignee) Motorola Mobility LLC     

(Current Assignee)
Google Technology Holdings LLC
Using pressure differences with a touch-sensitive display screen electronic device personal communications

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)
discloses generating a random based on the measurements received by a motion sensor…

discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

teaches wherein the plurality of force sensors detect finger movement through a protective element…

discloses a portable touch screen input device analogous in art with that of…
XXXXXXXXX
126

US20120254747A1

(Radu Catalin Bocirnea, 2012)
(Original Assignee) McKesson Financial Holdings ULC     

(Current Assignee)
Change Healthcare Holdings LLC
Methods, apparatuses and computer program products for generating regions of interest using gestures via a user interface first portion more regions

second set generate one

holding pattern first region

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses this limitation in that volume adjustment slider icon…

teaches the sensing unit senses a bending degree of the display unit and wherein the controller changes a moving speed…

discloses wherein the manipulation instructions comprise pinching finger movements for zoom instructions see paragraphs…

discloses converting inputted analog waveforms into respective digital representations using a binary search…
XXXX
127

WO2011126893A2

(Brent Keeth, 2011)
(Original Assignee) Micron Technology, Inc.     Apparatuses enabling concurrent communication between an interface die and a plurality of dice stacks, interleaved conductive paths in stacked devices, and methods for forming and operating the same electronic device status display panel, user input command signal

heat signature third position

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a product comprising TSVs whereby a particular die or common wafer is chosen from a library of available…

teaches the substrate is a material selected from the group consisting of gallium arsenide indium phosphide silicon…

discloses solder bumps on a side of the dummy silicon die ie made of metal and electrically connected through silicon…

discloses wherein securing together the semiconductor chips so as to form a base structure comprises forming a base…
XXXXXXX
128

JP2012128825A

(Masaya Azuma, 2012)
(Original Assignee) Sharp Corp; シャープ株式会社     電子機器、表示制御方法、およびプログラム (Electronic device, display control method, and program) thermal sensors 前記センサ (said sensor)

display screen, screen mode タッチ (touch)

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses a touch screen device analogous in art with that of…

teaches the second detection region surrounding a periphery of the first detection region on all sides…

discloses a method comprising providing a device that includes at least one touch sensor…

teaches a system for automatically adapting a graphical user interface GUI of a device…
XXXXXXX
129

US20120240044A1

(William J. Johnson, 2012)
(Original Assignee) Johnson William J; Johnson Jason M     System and method for summoning user interface objects display system first location

area comprising vertices said object

second mode said input

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)
discloses highlighting the first notification and the second notification see…

discloses the dismissal gesture includes dragging the card off screen…

discloses a display control apparatus comprising a proximity detector to detect proximity of an indicator to a display…

teaches further comprising a data interface for communicating data received at the touchsensitive surface a touch…
XXXXXXXXXXXXXXXXX
130

EP2500898A1

(W. Garland Phillips, 2012)
(Original Assignee) Research in Motion Ltd     

(Current Assignee)
BlackBerry Ltd
System and method for foldable display touchscreen layer, touchscreen display computing system

display system display system

second set generate one

receiving touch display area

user input user input

35 U.S.C. 103(a)

35 U.S.C. 102(b)
discloses a pager that detects light quantity received at an LCD and adjusts light intensity of the LCD according to the…

teaches a liquid crystal display that includes a liquid crystal display panel…

discloses hardware components located within the PDA and thus supported by the housing utilized to communicate with the…

discloses the phone to use control parameters to operate a timer function for enabling a calendar function for entering…
XXXXXXXXXXXXXXXX
131

US20130021295A1

(Tomohiro Kimura, 2013)
(Original Assignee) Sharp Corp     

(Current Assignee)
Sharp Corp
Display device with touch panel function screen mode, display screen crystal display device, projection area

second mode said input

touchscreen layer inner side

35 U.S.C. 103(a)

35 U.S.C. 102(b)
discloses a display panel comprising a first substrate section…

teaches the touchscreen display device as recited above in claim…

teaches wherein the touching and dragging along the peeled edge performs a predetermined function as application is…

discloses wherein the bending input comprises at least one of bending the device and unbending the device…
XXXXXX
132

US8373675B2

(Jin Jeon, 2013)
(Original Assignee) Samsung Display Co Ltd     

(Current Assignee)
Samsung Display Co Ltd
Display panel, display apparatus having the same, and method thereof first portion sensing electrodes

first mode first direction

touchscreen area electric field

XXXX
133

US20120162087A1

(Chih Sheng Hou, 2012)
(Original Assignee) Universal Cement Corp     

(Current Assignee)
Universal Cement Corp
Cover glass button for display of mobile device information items said platform

display screen, screen mode touch panel

35 U.S.C. 103(a)

35 U.S.C. 102(b)
teaches a touch display with a mask layer and an ink layer…

teaches controlling activationdeactivation of touch sensitive screen ie second input area as well as other component…

discloses wherein after sensing and registering a function is performed based off of the information the function may be…

teaches wherein the controller recognizes a particular user motion according to whether or not the variation of the…
XXXXXX
134

US20120032876A1

(Joseph Akwo Tabe, 2012)
(Original Assignee) Joseph Akwo Tabe     Mega communication and media apparatus configured to provide faster data transmission speed and to generate electrical energy electronic device status display panel said communication network, touch screen

first set telecommunications system

screen mode power management module, electronic book

thermal sensors electromagnetic signals

usage frequency spectral efficiency

touchscreen area, user input area photovoltaic array, electrical power

receiving touch electrical voltage, input apparatus

operating system status bar vertical movement

electronic device electronic device

display screen electronic wafer

holding pattern, user input discharge cycles, output port

second set structured data, remote device

add one receiving port, other node

first portion textile fibers, common node

display system rate signals

area comprising vertices said object

35 U.S.C. 103(a)

35 U.S.C. 102(e)
discloses a portable radio terminal device comprising a plurality of transmission antennas separately provided a…

discloses a rotation shaft provided in the connection portion connector…

teaches an erected printed circuit board that is farther in normal use from a user of the mobile terminal…

discloses that the proximity transmit power level is limited to a predetermined maximum level column…
XXXXXXXXXXXXXXXXXX
135

EP2447818A1

(Scott Peter Gammon, 2012)
(Original Assignee) Research in Motion Ltd     

(Current Assignee)
BlackBerry Ltd
Method and portable electronic device for presenting text electronic device electronic device

display screen display screen

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches the information processing apparatus according to claim…

discloses having a notification associated with certain icons for example email waiting indicatornotification…

discloses a method to detect the when the end of the list has been reached…

teaches a centrally located control feature which allows a user to pan between views eg in column…
XXXXXXXXXXX
136

CN102668522A

(M·奥克斯曼, 2012)
(Original Assignee) Nokia Oyj     

(Current Assignee)
Nokia Technologies Oy
包括滑动显示部分的装置 (Apparatus including a sliding display portion) virtual bezel, virtual bezel region 手持电子设备 (handheld electronic device)

first set, second set 一组控制 (a set of controls)

usage frequency 的使用 (use of)

35 U.S.C. 103(a)

35 U.S.C. 102(a)
teaches the portable electronic apparatus according to claim…

teaches a display device which utilizes icons in one display to show and control content in another display…

teaches in addition a cover may be provided to prevent the subdevice from being separated undesirably after it is…

discloses that the signal changes as the cursor movement accelerates and based upon the speed of said cursor…
XXXXXXXXXXXXXXX
137

JP2012073873A

(Koji Inami, 2012)
(Original Assignee) Nec Casio Mobile Communications Ltd; Necカシオモバイルコミュニケーションズ株式会社     情報処理装置および入力方法 (Information processing apparatus and input method) virtual bezel 前記操作 (said operation)

display screen, screen mode タッチ (touch)

receiving touch パネル (panel)

XXXXXXXXXXXXXXXX
138

US20120244348A1

(Min Soo Park, 2012)
(Original Assignee) LG Chem Ltd     

(Current Assignee)
LG Chem Ltd
Touch panel user input, user input area containing nitrogen, molecular weight

screen mode acrylic acid

35 U.S.C. 103(a)

35 U.S.C. 102(b)
discloses a propylene homopolymer and propylene impact copolymer can be blended together and amount of impact copolymer…

discloses that the additives may be combined with the polymers for films such as polyethylene col…

discloses an embodiment where sealable multilayer film may contain an intermediate layer middle layer of the present…

teaches tape backing composition wherein the tape backing comprises polypropylene polymers…
XXXXX
139

US20120182249A1

(Yuko Endo, 2012)
(Original Assignee) Nissha Printing Co Ltd     

(Current Assignee)
Nissha Printing Co Ltd
Mount structure of touch input device having pressure sensitive sensor electronic device status display panel transparent window

second set, second portion shaped electrodes

display screen, screen mode touch panel

35 U.S.C. 103(a)

35 U.S.C. 102(b)
teaches wherein the plurality of force sensors detect finger movement through a protective element…

discloses a comparable touch padpanel device that has been improved in the same way as the claimed invention…

discloses wherein a touch sensor may be input to an electronic device via a USB port paragraph…

teaches measuring a placement state of the mobile terminal relative to the ground…
XXXXXXX
140

JP2012058910A

(Kentaro Ozawa, 2012)
(Original Assignee) Nec Corp; 日本電気株式会社     携帯端末装置及びプログラム (Mobile terminal device and program) display screen, screen mode 画面全体 (entire screen)

virtual bezel 前記操作 (said operation)

XXXXXXXXXXXXXXXX
141

US20120034954A1

(Joseph Akwo Tabe, 2012)
(Original Assignee) Joseph Akwo Tabe     Mega communication and media apparatus configured to prevent brain cancerous deseases and to generate electrical energy first set telecommunications system

thermal sensors electromagnetic signals

receiving touch detection sensitivity, transmitting device

first mode, screen mode thin film solar cell, electronic book

touchscreen area, user input area photovoltaic array, electrical power

electronic device electronic device, sound waves

display system wireless signals

holding pattern discharge cycles

display screen electronic wafer

s hand threshold value

heat signature electric power, energy value

first portion textile fibers

electronic device status display panel touch screen

third set secondary e

second mode said input

information items input data

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches a method of providing a user interface whereas claim…

teaches contact lists and also teaches storing photo images and using icon it would have been obvious to one of…

discloses that it is well known in the art to have portable communications devices with calendar applications a…

teaches that the phone can detect ambiguous input with the second assignment and it can also disambiguate the detected…
XXXXXXXXXXXXXXXXXX
142

US20120154328A1

(Kenji Kono, 2012)
(Original Assignee) Kyocera Corp     

(Current Assignee)
Kyocera Corp
Input apparatus receiving touch input apparatus

status bar visibility display unit

35 U.S.C. 103(a)

35 U.S.C. 102(a)
discloses the touch sense control mechanism has a touch sense surface that is limited in surface size in one of its…

discloses shuffling a playlist based on a generated random thus randomly selecting one of the songs on the playlist to…

discloses one or more tangible device readable media wherein formatting the series of sensor images to form a synthetic…

discloses generating a random based on the measurements received by a motion sensor PAGE…
X
143

US20120019448A1

(Petri Sakari Pitkanen, 2012)
(Original Assignee) Nokia Oyj     

(Current Assignee)
Nokia Oyj
User Interface with Touch Pressure Level Sensing display screen, touchscreen display touch panel function

display system, electronic device status display panel control pad

35 U.S.C. 103(a)

35 U.S.C. 102(b)
teaches wherein the touch substrate defines a plurality of channels between adjacent ones of the plurality of keys see…

teaches the first and second tactile sensations associated with a button press and release…

discloses a base touch input panel upon which the claimed invention is an improvement…

discloses wherein a touch sensor may be input to an electronic device via a USB port paragraph…
XXXXXXXXXXXXXXXXXX
144

US8270148B2

(David Griffith, 2012)
(Original Assignee) VATTERLEDENS INVEST AB     

(Current Assignee)
Apple Inc
Suspension for a pressure sensitive touch display or panel touchscreen display sensitive touch

operating system status bar comprises one

screen mode touch display

XXXX
145

EP2386935A1

(Kuo-Feng Tong, 2011)
(Original Assignee) Research in Motion Ltd     

(Current Assignee)
BlackBerry Ltd
Method of providing tactile feedback and electronic device electronic device electronic device

operating system status bar comprises one

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
discloses wherein the restriction policy restricts access to the managed mobile device by activating a screen lock on…

teaches the tactile input device as explained above for claim…

teaches a flicking touch in a direction in which the flicking touch proceeds by changing the display state of data to…

teaches wherein the control unit drives the plurality of vibrators to generate vibrations at locations corresponding…
XXXXXXXXXX
146

EP2375309A1

(Kuo-Feng Tong, 2011)
(Original Assignee) Research in Motion Ltd     

(Current Assignee)
BlackBerry Ltd
Handheld device with localized delays for triggering tactile feedback electronic device electronic device

s hand threshold value

35 U.S.C. 103(a)

35 U.S.C. 102(b)

35 U.S.C. 102(e)
teaches the controller is configured to change in one direction at least one of the output volume and the replay speed…

teaches the audio output module outputs the first sound source and then outputs the second sound source while the…

teaches changing a position of the special key in software keyboard for a moment…

teaches the first and second tactile sensations associated with a button press and release…
XXXXXXXXX
147

EP2375314A1

(Jason Tyler Griffin, 2011)
(Original Assignee) Research in Motion Ltd     

(Current Assignee)
BlackBerry Ltd
Touch-sensitive device and method of control status bar visibility ambient parameter

first set second function

touchscreen layer first function

XXX




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
FIFTH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION, PROCEEDINGS. : 423-428 2002

Publication Year: 2002

Hand Gesture Recognition Using Multi-scale Colour Features, Hierarchical Models And Particle Filtering

Kungliga Tekniska högskolan (KTH Royal Institute of Technology Sweden)

Bretzner, Laptev, Lindeberg, Ieee Computer Society, Ieee Computer Society
US9645663B2
CLAIM 1
A display system for an electronic device (hand gestures) comprising: a touch-sensitive display screen configured to display content to a user of the electronic device;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device, the active touchscreen region configured to display a first portion of the content on the display screen;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region, the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device, the virtual bezel region configured to display a second portion of the content on the display screen;

and non-transitory memory storing a gestural software application in communication with the display screen, the gestural software application configured to produce the second mode of response in the virtual bezel region, wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen.
Hand Gesture Recognition Using Multi-scale Colour Features, Hierarchical Models And Particle Filtering. This paper presents algorithms and a prototype system for hand tracking and hand posture recognition. Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales, with qualitative inter-relations in terms of scale, position and orientation. In each image, detection of multi-scale colour features is performed. Hand states are then simultaneously detected and tracked using particle filtering, with an extension of layered sampling referred to as hierarchical layered sampling. Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour. These components have been integrated into a real-time prototype system, applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel). In a simplified demo scenario, this system has been successfully tested by participants at two fairs during 2001.
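
For ease of comparing the claim 1 limitations against the charted references, the following is a minimal illustrative sketch, in Python, of a display surface partitioned into an active touchscreen region (first mode of response) and a virtual bezel region whose touches are selectively interpreted as intentional input (second mode of response). The class, gesture names and bezel width (VirtualBezelScreen, swipe_in, bezel_px) are hypothetical assumptions for illustration only, not the patent's or any reference's implementation.

```python
# Minimal sketch (not the patent's or any vendor's implementation) of the claim 1
# structure: one physical touch surface split into an active region (first mode of
# response) and a virtual bezel strip (second mode of response). Names are assumed.
from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float
    kind: str  # e.g. "tap", "swipe_in", "long_press"

class VirtualBezelScreen:
    def __init__(self, width: int, height: int, bezel_px: int = 48):
        self.width, self.height, self.bezel_px = width, height, bezel_px
        # Second mode of response: only these gestures count as intentional input.
        self.intentional_bezel_gestures = {"swipe_in", "long_press"}

    def region_of(self, t: Touch) -> str:
        b = self.bezel_px
        inside_active = b <= t.x <= self.width - b and b <= t.y <= self.height - b
        return "active" if inside_active else "bezel"

    def handle(self, t: Touch) -> str:
        if self.region_of(t) == "active":
            # First mode of response: every touch acts on the displayed content.
            return f"content action: {t.kind} at ({t.x:.0f}, {t.y:.0f})"
        # Second mode of response: selectively interpret bezel touches, so that a
        # resting thumb ("tap"-like contact) is ignored instead of moving content.
        if t.kind in self.intentional_bezel_gestures:
            return f"bezel command affecting active content: {t.kind}"
        return "ignored (assumed incidental grip contact)"

if __name__ == "__main__":
    screen = VirtualBezelScreen(1080, 1920)
    print(screen.handle(Touch(540, 900, "tap")))      # active region
    print(screen.handle(Touch(10, 900, "tap")))       # bezel, ignored
    print(screen.handle(Touch(10, 900, "swipe_in")))  # bezel, intentional
```

The point of the sketch is only that region membership, not separate touch hardware, selects the mode of response, which is the distinction the semantic mappings above are targeting.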

US9645663B2
CLAIM 7
The display system according to claim 1, wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (hand gestures) for the gestural hardware on how a multi-touch input will be processed.
Hand Gesture Recognition Using Multi-scale Colour Features, Hierarchical Models And Particle Filtering. This paper presents algorithms and a prototype system for hand tracking and hand posture recognition. Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales, with qualitative inter-relations in terms of scale, position and orientation. In each image, detection of multi-scale colour features is performed. Hand states are then simultaneously detected and tracked using particle filtering, with an extension of layered sampling referred to as hierarchical layered sampling. Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour. These components have been integrated into a real-time prototype system, applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel). In a simplified demo scenario, this system has been successfully tested by participants at two fairs during 2001.
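
Claim 7 adds a user-settable instruction governing a multi-touch input that originates simultaneously in both regions. A hedged sketch of such a policy switch, reusing the hypothetical region labels from the sketch above, could look as follows; the policy names are assumptions made for illustration.

```python
# Hypothetical handling for claim 7: a stored user preference decides how a
# multi-touch gesture that starts in both regions at once is processed.
MULTITOUCH_POLICIES = ("treat_as_active", "treat_as_bezel", "ignore")

def handle_multitouch(touch_regions: list[str], policy: str) -> str:
    if policy not in MULTITOUCH_POLICIES:
        raise ValueError(f"unknown policy: {policy}")
    spans_both = "active" in touch_regions and "bezel" in touch_regions
    if not spans_both:
        return "route to single-region handling"
    # The claim's "instruction made by user" is modeled here as the policy value.
    if policy == "treat_as_active":
        return "process all contacts in the first mode of response"
    if policy == "treat_as_bezel":
        return "process all contacts in the second mode of response"
    return "discard the cross-region multi-touch"

print(handle_multitouch(["active", "bezel"], "treat_as_active"))
```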

US9645663B2
CLAIM 8
The display system according to claim 1, wherein an operating system status bar (image feature) resides in the virtual bezel region, and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode.
Hand Gesture Recognition Using Multi-scale Colour Features, Hierarchical Models And Particle Filtering. This paper presents algorithms and a prototype system for hand tracking and hand posture recognition. Hand postures are represented in terms of hierarchies of multi-scale colour image features (operating system status bar) at different scales, with qualitative inter-relations in terms of scale, position and orientation. In each image, detection of multi-scale colour features is performed. Hand states are then simultaneously detected and tracked using particle filtering, with an extension of layered sampling referred to as hierarchical layered sampling. Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour. These components have been integrated into a real-time prototype system, applied to a test problem of controlling consumer electronics using hand gestures. In a simplified demo scenario, this system has been successfully tested by participants at two fairs during 2001.
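
Claim 8 places an operating system status bar in the virtual bezel and lets the user toggle its visibility (a full-screen mode) with a predefined set of gestures. A minimal sketch of that behavior is given below; the gesture set and class name are assumed for the example and do not come from the patent or the charted references.

```python
# Hypothetical sketch for claim 8: an OS status bar that lives in the virtual bezel
# and whose visibility (full-screen mode) is toggled by a predefined bezel gesture.
class StatusBarController:
    TOGGLE_GESTURES = {"double_tap_bezel", "two_finger_swipe_down"}  # assumed set

    def __init__(self) -> None:
        self.status_bar_visible = True

    def on_bezel_gesture(self, gesture: str) -> bool:
        """Return the new visibility; unrecognized gestures leave it unchanged."""
        if gesture in self.TOGGLE_GESTURES:
            self.status_bar_visible = not self.status_bar_visible
        return self.status_bar_visible

ctrl = StatusBarController()
assert ctrl.on_bezel_gesture("double_tap_bezel") is False  # enter full-screen mode
assert ctrl.on_bezel_gesture("tap") is False               # ignored, stays hidden
assert ctrl.on_bezel_gesture("double_tap_bezel") is True   # status bar shown again
```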

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (hand gestures) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .

US9645663B2
CLAIM 13
. The electronic device (hand gestures) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .

US9645663B2
CLAIM 14
. An electronic device (hand gestures) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .

US9645663B2
CLAIM 15
. The electronic device (hand gestures) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (hand gestures) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .
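The claim-16 method above walks through a concrete sequence (detect the grip region, register it as the virtual bezel, interpret bezel input as intentional, ask the user what response to execute, store that answer as personalized behaviour). The sketch below is a toy, hypothetical rendering of those steps; the 40-pixel edge heuristic, the dictionary standing in for device memory, and all method names are assumptions, not the patent's disclosed implementation.

```python
class VirtualBezelSetup:
    """Hypothetical walk-through of the claim-16 steps: detect where the holding hand
    touches the screen, register that region as the virtual bezel, interpret bezel
    input as intentional, and store the user's chosen response type."""

    def __init__(self):
        self.memory = {}   # stands in for the device's non-volatile memory

    def detect_holding_region(self, touch_points):
        # Assumption: touches within 40 px of the left border come from the grip.
        return [p for p in touch_points if p[0] < 40]

    def register_bezel(self, region):
        self.memory["virtual_bezel_region"] = region

    def interpret_bezel_input(self, touch_point):
        # Inputs inside the registered region are treated as intentional commands
        # meant to affect content displayed outside the bezel.
        if touch_point in self.memory.get("virtual_bezel_region", []):
            return "intentional: affect main content"
        return "outside bezel: normal touch"

    def register_user_preference(self, response_type):
        # e.g. "scroll", "ignore", "show_menu", chosen when the system asks the user.
        self.memory["bezel_response"] = response_type

setup = VirtualBezelSetup()
grip = setup.detect_holding_region([(12, 300), (15, 420), (200, 250)])
setup.register_bezel(grip)
print(setup.interpret_bezel_input((12, 300)))
setup.register_user_preference("scroll")
print(setup.memory)
```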

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (hand gestures) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .
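Claim 17 turns on registering a polygonal grip area, counting how often it is accessed, and deriving a personalized holding pattern from that frequency. The snippet below is a hypothetical toy model of that logic; the frequency threshold, polygon representation, and class name are invented for illustration only.

```python
from collections import Counter

class HoldingPatternLearner:
    """Hypothetical sketch of claim 17: store the polygon traced by unintentional grip
    touches, count how often it is accessed, and once it is frequent enough register it
    as the personalized holding pattern defining the virtual bezel region."""

    FREQUENCY_THRESHOLD = 5   # assumed number of hits before the polygon is trusted

    def __init__(self):
        self.memory = {}
        self.hit_counts = Counter()

    def register_polygon(self, name, vertices):
        # vertices: list of (x, y) corners of the grip area on the touchscreen
        self.memory[name] = {"vertices": vertices, "is_bezel": False}

    def record_access(self, name):
        self.hit_counts[name] += 1
        if self.hit_counts[name] >= self.FREQUENCY_THRESHOLD:
            # Frequent grip contact promotes the polygon to the personalized bezel.
            self.memory[name]["is_bezel"] = True

learner = HoldingPatternLearner()
learner.register_polygon("left_thumb", [(0, 200), (35, 200), (35, 480), (0, 480)])
for _ in range(5):
    learner.record_access("left_thumb")
print(learner.memory["left_thumb"]["is_bezel"])   # True: region now the virtual bezel
```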

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (hand gestures) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
Hand Gesture Recognition Using Multi-scale Colour Features , Hierarchical Models And Particle Filtering . This paper presents algorithms and a prototype system for hand tracking and hand posture recognition . Hand postures are represented in terms of hierarchies of multi-scale colour image features at different scales , with qualitative inter-relations in terms of scale , position and orientation . In each image , detection of multi-scale colour features is performed . Hand states are then simultaneously detected and tracked using particle filtering , with an extension of layered sampling referred to as hierarchical layered sampling . Experiments are presented showing that the performance of the system is substantially improved by performing feature detection in colour space and including a prior with respect to skin colour . These components have been integrated into a real-time prototype system , applied to a test problem of controlling consumer electronics using hand gestures (electronic device, electronic device status display panel) . In a simplified demo scenario , this system has been successfully tested by participants at two fairs during 2001 .
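Claim 18 differs from claim 17 mainly in how the polygon is obtained: a heat signature from the holding hand, read from the device's thermal sensors, defines the vertices. The function below is a hypothetical sketch of only that step, thresholding a toy grid of temperature readings and returning a bounding polygon; the grid layout, the 31 C threshold, and the axis-aligned box are all assumptions.

```python
def heat_signature_polygon(thermal_frame, threshold=31.0):
    """Hypothetical claim-18 step: threshold a grid of thermal-sensor readings
    (degrees C) and return the vertices of the bounding polygon of the warm area
    left by the holding hand."""
    warm = [(x, y)
            for y, row in enumerate(thermal_frame)
            for x, temp in enumerate(row)
            if temp >= threshold]
    if not warm:
        return []
    xs = [p[0] for p in warm]
    ys = [p[1] for p in warm]
    # Axis-aligned bounding box as a simple stand-in for the claimed polygonal area.
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

frame = [
    [24.0, 24.1, 24.0, 24.2],
    [32.5, 33.0, 24.1, 24.0],   # warm band where the palm rests on the edge
    [32.8, 33.4, 24.3, 24.1],
    [24.2, 24.0, 24.0, 24.0],
]
print(heat_signature_polygon(frame))   # [(0, 1), (1, 1), (1, 2), (0, 2)]
```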




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CONFERENCE ON HUMAN FACTORS IN COMPUTING SYSTEMS, VOLS 1 AND 2. : 657-666 2007

Publication Year: 2007

Shift: A Technique For Operating Pen-Based Interfaces Using Touch

University of Toronto, Ontario, Canada

Vogel, Baudisch, Acm
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (visual feedback) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback (user input) , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .
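To make the mapped "visual feedback" passage easier to follow, the snippet below gives a hypothetical, simplified rendering of the Shift behaviour described in the abstract: a callout with a copy of the occluded area is created only for small targets, while large targets behave like an unaltered touch screen. The pixel sizes, offsets, and dictionary structure are illustrative assumptions, not the authors' implementation.

```python
def shift_touch(target_size_px, touch_xy, occlusion_radius_px=12, callout_offset_px=60):
    """Hypothetical sketch of the Shift idea: for small targets, show a callout with a
    copy of the occluded area plus a pointer; for large targets, behave like a plain
    touch screen."""
    x, y = touch_xy
    if target_size_px > 2 * occlusion_radius_px:
        # Large target: no callout, the raw touch point is the selection point.
        return {"callout": None, "selection_point": (x, y)}
    # Small target: place a callout above the finger showing the occluded region,
    # and let the user fine-tune the pointer before lifting the finger to commit.
    return {
        "callout": {"copy_of_region_centre": (x, y),
                    "shown_at": (x, y - callout_offset_px)},
        "selection_point": (x, y),   # updated as the finger slides; lift-off commits
    }

print(shift_touch(60, (100, 200)))   # big button: no callout
print(shift_touch(10, (100, 200)))   # tiny link: callout displaced above the finger
```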

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen (electronic device status display panel) . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen (electronic device status display panel) . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (visual feedback) intended to affect the display of the first portion of the content on the active touchscreen region .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback (user input) , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (visual feedback) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback (user input) , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (visual feedback) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback (user input) , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (visual feedback) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
Shift : A Technique For Operating Pen-Based Interfaces Using Touch . Retrieving the stylus of a pen-based device takes time and requires a second hand . Especially for short intermittent interactions many users therefore choose to use their bare fingers . Although convenient , this increases targeting times and error rates . We argue that the main reasons are the occlusion of the target by the user's finger and ambiguity about which part of the finger defines the selection point . We propose a pointing technique we call Shift that is designed to address these issues . When the user touches the screen , Shift creates a callout showing a copy of the occluded screen area and places it in a non-occluded location . The callout also shows a pointer representing the selection point of the finger . Using this visual feedback (user input) , users guide the pointer into the target by moving their finger on the screen surface and commit the target acquisition by lifting the finger . Unlike existing techniques , Shift is only invoked when necessary-over large targets no callout is created and users enjoy the full performance of an unaltered touch screen . We report the results of a user study showing that with Shift participants can select small targets with much lower error rates than an unaided touch screen and that Shift is faster than Offset Cursor for larger targets .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
UIST 2007: PROCEEDINGS OF THE 20TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY. : 269-278 2007

Publication Year: 2007

LucidTouch: A See-Through Mobile Device

Mitsubishi Electric Research Laboratories

Wigdor, Forlines, Baudisch, Barnwell, Shen, Acm
US9645663B2
CLAIM 1
. A display system (higher precision) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .
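The pseudo-transparency described in the LucidTouch abstract amounts to compositing a silhouette of the rear-touching hand over the front display. The snippet below is a minimal, hypothetical illustration of that compositing step using tiny greyscale lists in place of real image buffers; the blend factor and data layout are assumptions.

```python
def pseudo_transparent_overlay(screen, hand_mask, alpha=0.4):
    """Hypothetical sketch of LucidTouch-style pseudo-transparency: blend a silhouette
    of the hand touching the BACK of the device onto the front screen so the fingers
    do not occlude the content. Greyscale 2D lists stand in for real image buffers."""
    blended = []
    for screen_row, mask_row in zip(screen, hand_mask):
        blended.append([
            # 'behind' is 1 where a rear finger is detected, 0 elsewhere
            round((1 - alpha) * pixel + alpha * 255) if behind else pixel
            for pixel, behind in zip(screen_row, mask_row)
        ])
    return blended

screen = [[10, 10, 10], [10, 10, 10]]          # dark UI content
hand   = [[0, 1, 0], [0, 1, 1]]                # rear fingers under two columns
print(pseudo_transparent_overlay(screen, hand))
# [[10, 108, 10], [10, 108, 108]] -> a faint "ghost" of the hand over the content
```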

US9645663B2
CLAIM 2
. The display system (higher precision) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 3
. The display system (higher precision) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 4
. The display system (higher precision) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .
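Claims 3 and 4, charted just above, recite a simple attribution rule: a touch that starts in one region and ends in the other is processed in the region where it originated. The sketch below is a hypothetical toy illustration of that rule; the left/right bezel strip geometry and the 40-pixel width are assumptions.

```python
def region_of(point, bezel_width_px=40, screen_width_px=480):
    """Assumed geometry: the virtual bezel is a strip along the left and right edges."""
    x, _ = point
    if x < bezel_width_px or x > screen_width_px - bezel_width_px:
        return "virtual_bezel"
    return "active_touchscreen"

def attribute_drag(start_point, end_point):
    """Claims 3/4 style rule: a drag is processed in the region where it ORIGINATED,
    even if it terminates in the other region. A hypothetical illustration only."""
    return region_of(start_point)

# Starts in the active area, ends in the bezel -> still an active-area gesture (claim 3).
print(attribute_drag((200, 300), (10, 300)))    # active_touchscreen
# Starts in the bezel, ends in the active area -> still a bezel gesture (claim 4).
print(attribute_drag((10, 300), (200, 300)))    # virtual_bezel
```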

US9645663B2
CLAIM 5
. The display system (higher precision) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 6
. The display system (higher precision) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 7
. The display system (higher precision) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .
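Claim 7, charted above, routes a multi-touch input that starts simultaneously in both regions according to an instruction the user previously gave. The snippet below is a hypothetical toy policy object showing one way such a stored instruction could drive the routing decision; the policy strings and class name are invented for illustration.

```python
class MultiTouchPolicy:
    """Hypothetical claim-7 behaviour: when a multi-touch gesture begins with fingers
    in BOTH regions at once, route it according to an instruction the user stored
    earlier for the gestural software/hardware."""

    def __init__(self, stored_instruction="treat_as_active_region"):
        # assumed options: "treat_as_active_region", "treat_as_bezel", "ignore"
        self.stored_instruction = stored_instruction

    def route(self, touch_regions):
        spans_both = {"active_touchscreen", "virtual_bezel"} <= set(touch_regions)
        if not spans_both:
            return touch_regions[0]          # single-region gesture: no policy needed
        if self.stored_instruction == "treat_as_bezel":
            return "virtual_bezel"
        if self.stored_instruction == "ignore":
            return "discarded"
        return "active_touchscreen"

policy = MultiTouchPolicy(stored_instruction="treat_as_bezel")
print(policy.route(["active_touchscreen", "virtual_bezel"]))   # virtual_bezel, per the user
```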

US9645663B2
CLAIM 8
. The display system (higher precision) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 9
. The display system (higher precision) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .
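Claim 9, charted above, places a pre-defined set of touch-based soft buttons in the virtual bezel and lets the user reposition them within that region. The sketch below is a hypothetical toy illustration in which a dragged button is clamped so it never leaves the bezel strip; the coordinates, button names, and bezel bounds are assumptions.

```python
class BezelSoftButtons:
    """Hypothetical claim-9 sketch: pre-defined soft buttons live inside the virtual
    bezel, and the user may drag a button to a new position that is clamped so it
    never leaves the bezel strip."""

    def __init__(self, bezel_x_range=(0, 40), bezel_y_range=(0, 800)):
        self.bezel_x_range = bezel_x_range
        self.bezel_y_range = bezel_y_range
        self.buttons = {"back": (20, 100), "home": (20, 200), "menu": (20, 300)}

    def reposition(self, name, new_xy):
        x = min(max(new_xy[0], self.bezel_x_range[0]), self.bezel_x_range[1])
        y = min(max(new_xy[1], self.bezel_y_range[0]), self.bezel_y_range[1])
        self.buttons[name] = (x, y)     # clamped inside the virtual bezel region
        return self.buttons[name]

buttons = BezelSoftButtons()
print(buttons.reposition("home", (200, 700)))   # (40, 700): pulled back into the bezel
```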

US9645663B2
CLAIM 10
. The display system (higher precision) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 11
. The display system (higher precision) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 12
. The display system (higher precision) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision (display system) , and the ability to make multi-finger input .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (s hand) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
LucidTouch : A See-Through Mobile Device . Touch is a compelling input modality for interactive devices ;
however , touch input on the small screen of a mobile device is problematic because a user's fingers occlude the graphical elements he wishes to work with . In this paper , we present LucidTouch , a mobile device that addresses this limitation by allowing the user to control the application by touching the back of the device . The key to making this usable is what we call pseudo-transparency : by overlaying an image of the user's hands (s hand) onto the screen , we create the illusion of the mobile device itself being semi-transparent . This pseudo-transparency allows users to accurately acquire targets while not occluding the screen with their fingers and hand . LucidTouch also supports multi-touch input , allowing users to operate the device simultaneously with all 10 fingers . We present initial study results that indicate that many users found touching on the back to be preferable to touching on the front , due to reduced occlusion , higher precision , and the ability to make multi-finger input .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP). : 4205-4208 2012

Publication Year: 2012

SILENCE IS GOLDEN: MODELING NON-SPEECH EVENTS IN WFST-BASED DYNAMIC NETWORK DECODERS

Rheinisch-Westfälische Technische Hochschule Aachen (RWTH Aachen University)

Rybach, Schlueter, Ney, Ieee
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (speech recognition) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
SILENCE IS GOLDEN : MODELING NON-SPEECH EVENTS IN WFST-BASED DYNAMIC NETWORK DECODERS . Models for silence are a fundamental part of continuous speech recognition (first mode) systems . Depending on application requirements , audio data segmentation , and availability of detailed training data annotations , it may be necessary or beneficial to differentiate between other non-speech events , for example breath and background noise . The integration of multiple non-speech models in a WFST-based dynamic network decoder is not straightforward , because these models do not perfectly fit in the transducer framework . This paper describes several options for the transducer construction with multiple non-speech models , shows their considerably different characteristics in memory and runtime efficiency , and analyzes the impact on the recognition performance .
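As a very loose aid to reading the abstract above, the snippet below sketches the general idea of letting a decoder absorb silence or noise between words by attaching optional non-speech self-loop arcs at word-boundary states of a toy transducer. The dict-of-lists graph, arc tuple layout, and label names are assumptions for illustration only, not the paper's WFST construction.

```python
def add_non_speech_loops(arcs, boundary_states, non_speech_labels=("sil", "noise", "breath")):
    """Add optional self-loop arcs for non-speech events at word-boundary states of a
    toy transducer, so the decoder can absorb silence/noise without emitting words."""
    for state in boundary_states:
        for label in non_speech_labels:
            # (input_label, output_label, next_state, weight): epsilon output means
            # the non-speech event adds nothing to the recognised word sequence.
            arcs.setdefault(state, []).append((label, "<eps>", state, 0.5))
    return arcs

# Toy graph: state 0 --"hello"--> state 1 --"world"--> state 2
graph = {0: [("hello", "hello", 1, 1.0)], 1: [("world", "world", 2, 1.0)]}
graph = add_non_speech_loops(graph, boundary_states=[0, 1, 2])
print(len(graph[1]))   # 4 arcs: the word arc plus three optional non-speech self-loops
```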

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (speech recognition) of response in the active touchscreen region .
SILENCE IS GOLDEN : MODELING NON-SPEECH EVENTS IN WFST-BASED DYNAMIC NETWORK DECODERS . Models for silence are a fundamental part of continuous speech recognition (first mode) systems . Depending on application requirements , audio data segmentation , and availability of detailed training data annotations , it may be necessary or beneficial to differentiate between other non-speech events , for example breath and background noise . The integration of multiple non-speech models in a WFST-based dynamic network decoder is not straightforward , because these models do not perfectly fit in the transducer framework . This paper describes several options for the transducer construction with multiple non-speech models , shows their considerably different characteristics in memory and runtime efficiency , and analyzes the impact on the recognition performance .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (speech recognition) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
SILENCE IS GOLDEN : MODELING NON-SPEECH EVENTS IN WFST-BASED DYNAMIC NETWORK DECODERS . Models for silence are a fundamental part of continuous speech recognition (first mode) systems . Depending on application requirements , audio data segmentation , and availability of detailed training data annotations , it may be necessary or beneficial to differentiate between other non-speech events , for example breath and background noise . The integration of multiple non-speech models in a WFST-based dynamic network decoder is not straightforward , because these models do not perfectly fit in the transducer framework . This paper describes several options for the transducer construction with multiple non-speech models , shows their considerably different characteristics in memory and runtime efficiency , and analyzes the impact on the recognition performance .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (recognition system) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
SILENCE IS GOLDEN : MODELING NON-SPEECH EVENTS IN WFST-BASED DYNAMIC NETWORK DECODERS . Models for silence are a fundamental part of continuous speech recognition systems (s hand) . Depending on application requirements , audio data segmentation , and availability of detailed training data annotations , it may be necessary or beneficial to differentiate between other non-speech events , for example breath and background noise . The integration of multiple non-speech models in a WFST-based dynamic network decoder is not straightforward , because these models do not perfectly fit in the transducer framework . This paper describes several options for the transducer construction with multiple non-speech models , shows their considerably different characteristics in memory and runtime efficiency , and analyzes the impact on the recognition performance .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
2012 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP). : 4125-4128 2012

Publication Year: 2012

LATENT PERCEPTUAL MAPPING WITH DATA-DRIVEN VARIABLE-LENGTH ACOUSTIC UNITS FOR TEMPLATE-BASED SPEECH RECOGNITION

Deutsche Telekom Labs

Sundaram, Bellegarda
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (speech recognition) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (temporal alignment) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (new frame) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
LATENT PERCEPTUAL MAPPING WITH DATA-DRIVEN VARIABLE-LENGTH ACOUSTIC UNITS FOR TEMPLATE-BASED SPEECH RECOGNITION . In recent work , we introduced Latent Perceptual Mapping (LPM) [1] , a new framework (second mode) for acoustic modeling suitable for template-like speech recognition (first mode) . The basic idea is to leverage a reduced dimensionality description of the observations to derive acoustic prototypes that are closely aligned with perceived acoustic events . Our initial work adopted a bag-of-frames strategy to represent relevant acoustic information within speech segments . In this paper , we extend this approach by better integrating temporal information into the LPM feature extraction . Specifically , we use variable-length units to represent acoustic events at the supra-frame level , in order to benefit from finer temporal alignments (first portion) when deriving the acoustic prototypes . The outcome can be viewed as a generalization of both conventional template-based approaches and recently proposed sparse representation solutions . This extension is experimentally validated on a context-independent phoneme classification task using the TIMIT corpus .
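To make the mapped "reduced dimensionality" language concrete, here is a loose, generic sketch of deriving prototypes from a low-rank projection of segment features and classifying a new segment by its nearest prototype. It is only in the spirit of the LPM abstract; it is not the authors' published algorithm, and all sizes and data are arbitrary.

```python
# Loose illustration of a reduced-dimensionality "prototype" construction in the spirit
# of Latent Perceptual Mapping: project segment features into a low-rank space and
# treat cluster centroids there as acoustic prototypes. Generic sketch, not the
# published LPM algorithm; all sizes and data are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

# 200 speech segments, each summarized by a 40-dimensional feature vector
# (e.g. a bag-of-frames or variable-length-unit summary).
segments = rng.normal(size=(200, 40))

# Reduced-dimensionality description via truncated SVD.
rank = 8
U, s, Vt = np.linalg.svd(segments, full_matrices=False)
projected = U[:, :rank] * s[:rank]          # segments in the latent space

# Derive prototypes with a few iterations of plain k-means in the latent space.
k = 10
prototypes = projected[rng.choice(len(projected), size=k, replace=False)]
for _ in range(10):
    dists = np.linalg.norm(projected[:, None, :] - prototypes[None, :, :], axis=2)
    assign = dists.argmin(axis=1)
    for j in range(k):
        members = projected[assign == j]
        if len(members):
            prototypes[j] = members.mean(axis=0)

# A new segment is "recognized" by its nearest prototype (template matching).
new_segment = rng.normal(size=(1, 40)) @ Vt[:rank].T  # project with the same basis
print("nearest prototype:", np.linalg.norm(new_segment - prototypes, axis=1).argmin())
```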

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (speech recognition) of response in the active touchscreen region .
LATENT PERCEPTUAL MAPPING WITH DATA-DRIVEN VARIABLE-LENGTH ACOUSTIC UNITS FOR TEMPLATE-BASED SPEECH RECOGNITION . In recent work , we introduced Latent Perceptual Mapping (LPM) [1] , a new framework for acoustic modeling suitable for template-like speech recognition (first mode) . The basic idea is to leverage a reduced dimensionality description of the observations to derive acoustic prototypes that are closely aligned with perceived acoustic events . Our initial work adopted a bag-of-frames strategy to represent relevant acoustic information within speech segments . In this paper , we extend this approach by better integrating temporal information into the LPM feature extraction . Specifically , we use variable-length units to represent acoustic events at the supra-frame level , in order to benefit from finer temporal alignments when deriving the acoustic prototypes . The outcome can be viewed as a generalization of both conventional template-based approaches and recently proposed sparse representation solutions . This extension is experimentally validated on a context-independent phoneme classification task using the TIMIT corpus .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (speech recognition) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (temporal alignment) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (new frame) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
LATENT PERCEPTUAL MAPPING WITH DATA-DRIVEN VARIABLE-LENGTH ACOUSTIC UNITS FOR TEMPLATE-BASED SPEECH RECOGNITION . In recent work , we introduced Latent Perceptual Mapping (LPM) [1] , a new framework (second mode) for acoustic modeling suitable for template-like speech recognition (first mode) . The basic idea is to leverage a reduced dimensionality description of the observations to derive acoustic prototypes that are closely aligned with perceived acoustic events . Our initial work adopted a bag-of-frames strategy to represent relevant acoustic information within speech segments . In this paper , we extend this approach by better integrating temporal information into the LPM feature extraction . Specifically , we use variable-length units to represent acoustic events at the supra-frame level , in order to benefit from finer temporal alignments (first portion) when deriving the acoustic prototypes . The outcome can be viewed as a generalization of both conventional template-based approaches and recently proposed sparse representation solutions . This extension is experimentally validated on a context-independent phoneme classification task using the TIMIT corpus .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (new frame) of response in the virtual bezel region .
LATENT PERCEPTUAL MAPPING WITH DATA-DRIVEN VARIABLE-LENGTH ACOUSTIC UNITS FOR TEMPLATE-BASED SPEECH RECOGNITION . In recent work , we introduced Latent Perceptual Mapping (LPM) [1] , a new framework (second mode) for acoustic modeling suitable for template-like speech recognition . The basic idea is to leverage a reduced dimensionality description of the observations to derive acoustic prototypes that are closely aligned with perceived acoustic events . Our initial work adopted a bag-of-frames strategy to represent relevant acoustic information within speech segments . In this paper , we extend this approach by better integrating temporal information into the LPM feature extraction . Specifically , we use variable-length units to represent acoustic events at the supra-frame level , in order to benefit from finer temporal alignments when deriving the acoustic prototypes . The outcome can be viewed as a generalization of both conventional template-based approaches and recently proposed sparse representation solutions . This extension is experimentally validated on a context-independent phoneme classification task using the TIMIT corpus .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
JOURNAL OF MICROMECHANICS AND MICROENGINEERING. 20 (7): - JUL 2010

Publication Year: 2010

Transparent Conductive-polymer Strain Sensors For Touch Input Sheets Of Flexible Displays

東京大学, Tōkyō daigaku (The University of Tokyo)

Takamatsu, Takahata, Muraki, Iwase, Matsumoto, Shimoyama
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (panel display) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
Transparent Conductive-polymer Strain Sensors For Touch Input Sheets Of Flexible Displays . A transparent conductive polymer-based strain-sensor array , designed especially for touch input sheets of flexible displays , was developed . A transparent conductive polymer , namely poly(3,4-ethylenedioxythiophene):polystyrenesulfonate (PEDOT:PSS) , was utilized owing to its strength under repeated mechanical bending . PEDOT:PSS strain sensors with a thickness of 130 nm exhibited light transmittance of 92% , which is the same as the transmittance of ITO electrodes widely used in flat panel displays (touchscreen layer) . We demonstrated that the sensor array on a flexible sheet was able to sustain mechanical bending 300 times at a bending radius of 5 mm . The strain sensor shows a gauge factor of 5.2 . The touch point on a flexible sheet could be detected from histograms of the outputs of the strain sensors when the sheet was pushed with an input force of 5 N . The touch input could be detected on the flexible sheet with a curved surface (radius of curvature of 20 mm) . These results show that the developed transparent conductive polymer-based strain-sensor array is applicable to touch input sheets of mechanically bendable displays .
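For context, here is a generic sketch of locating a touch point from an array of strain-sensor readings, loosely following the abstract's histogram-of-outputs idea via an intensity-weighted centroid. The grid size, noise model, and Gaussian response are invented for the demonstration and do not come from the cited paper.

```python
# Rough illustration of locating a touch point from an array of strain-sensor
# readings, in the spirit of the PEDOT:PSS sheet described above: the dominant
# response region (summarized by a weighted centroid) marks the push location.
# Grid size, noise level and the Gaussian response model are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(1)
GRID = 8                       # 8 x 8 strain-sensor array on the flexible sheet

def simulated_readings(touch_rc, force=5.0):
    """Fake sensor outputs: a force-scaled bump at the touch location plus noise."""
    rows, cols = np.mgrid[0:GRID, 0:GRID]
    bump = force * np.exp(-((rows - touch_rc[0])**2 + (cols - touch_rc[1])**2) / 2.0)
    return bump + 0.05 * rng.normal(size=(GRID, GRID))

def locate_touch(readings):
    """Estimate the touch point as the intensity-weighted centroid of the readings."""
    readings = np.clip(readings, 0, None)
    total = readings.sum()
    rows, cols = np.mgrid[0:GRID, 0:GRID]
    return (float((rows * readings).sum() / total),
            float((cols * readings).sum() / total))

if __name__ == "__main__":
    readings = simulated_readings((3, 5))
    print("estimated touch point:", locate_touch(readings))  # near (3, 5)
```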

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (curved surface) using predefined set of gestures to toggle a full-screen mode .
Transparent Conductive-polymer Strain Sensors For Touch Input Sheets Of Flexible Displays . A transparent conductive polymer-based strain-sensor array , designed especially for touch input sheets of flexible displays , was developed . A transparent conductive polymer , namely poly(3,4-ethylenedioxythiophene):polystyrenesulfonate (PEDOT:PSS) , was utilized owing to its strength under repeated mechanical bending . PEDOT:PSS strain sensors with a thickness of 130 nm exhibited light transmittance of 92% , which is the same as the transmittance of ITO electrodes widely used in flat panel displays . We demonstrated that the sensor array on a flexible sheet was able to sustain mechanical bending 300 times at a bending radius of 5 mm . The strain sensor shows a gauge factor of 5.2 . The touch point on a flexible sheet could be detected from histograms of the outputs of the strain sensors when the sheet was pushed with an input force of 5 N . The touch input could be detected on the flexible sheet with a curved surface (status bar visibility) (radius of curvature of 20 mm) . These results show that the developed transparent conductive polymer-based strain-sensor array is applicable to touch input sheets of mechanically bendable displays .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (panel display) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
Transparent Conductive-polymer Strain Sensors For Touch Input Sheets Of Flexible Displays . A transparent conductive polymer-based strain-sensor array , designed especially for touch input sheets of flexible displays , was developed . A transparent conductive polymer , namely poly(3,4-ethylenedioxythiophene):polystyrenesulfonate (PEDOT:PSS) , was utilized owing to its strength under repeated mechanical bending . PEDOT:PSS strain sensors with a thickness of 130 nm exhibited light transmittance of 92% , which is the same as the transmittance of ITO electrodes widely used in flat panel displays (touchscreen layer) . We demonstrated that the sensor array on a flexible sheet was able to sustain mechanical bending 300 times at a bending radius of 5 mm . The strain sensor shows a gauge factor of 5.2 . The touch point on a flexible sheet could be detected from histograms of the outputs of the strain sensors when the sheet was pushed with an input force of 5 N . The touch input could be detected on the flexible sheet with a curved surface (radius of curvature of 20 mm) . These results show that the developed transparent conductive polymer-based strain-sensor array is applicable to touch input sheets of mechanically bendable displays .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
INTERSPEECH 2009: 10TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2009, VOLS 1-5. : 352-355 2009

Publication Year: 2009

Back-Off Language Model Compression

Google Inc

Harb, Chelba, Dean, Ghemawat
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (speech recognition) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
Back-Off Language Model Compression . With the availability of large amounts of training data relevant to speech recognition (first mode) scenarios , scalability becomes a very productive way to improve language model performance . We present a technique that represents a back-off n-gram language model using arrays of integer values and thus renders it amenable to effective block compression . We propose a few such compression algorithms and evaluate the resulting language model along two dimensions : memory footprint , and speed reduction relative to the uncompressed one . We experimented with a model that uses a 32-bit word vocabulary (at most 4B words) and log-probabilities/back-off-weights quantized to 1 byte , respectively . The best compression algorithm achieves 2.6 bytes/n-gram at approximately 18X slower than uncompressed . For faster LM operation we found it feasible to represent the LM at approximately 4.0 bytes/n-gram , and approximately 3X slower than the uncompressed LM . The memory footprint of a LM containing one billion n-grams can thus be reduced to 3-4 Gbytes without impacting its speed too much .
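To ground the mapped storage claim, the sketch below shows the generic idea of keeping n-grams as integer word IDs and quantizing log-probabilities to one byte, so the model reduces to integer arrays suitable for block compression. The codebook, table, and values are invented for illustration; this is not the reference's actual implementation.

```python
# Minimal sketch of the storage idea in the abstract above: keep n-grams as integer
# word IDs and quantize log-probabilities / back-off weights to one byte each, so the
# model becomes a set of integer arrays amenable to block compression. The codebook
# and example data are invented for illustration; the paper's bytes/n-gram figures
# additionally rely on block compression, which this sketch does not perform.
import numpy as np

# Toy trigram table: rows of (w1_id, w2_id, w3_id) with float log10 probabilities.
ngrams = np.array([[5, 17, 2], [5, 17, 9], [8, 3, 2]], dtype=np.uint32)
logprobs = np.array([-1.23, -2.75, -0.48])

# 1-byte quantization: map each log-prob onto 256 evenly spaced levels.
lo, hi = logprobs.min(), logprobs.max()
codes = np.round((logprobs - lo) / (hi - lo) * 255).astype(np.uint8)

def dequantize(code):
    """Recover an approximate log-prob from its 1-byte code."""
    return lo + (float(code) / 255.0) * (hi - lo)

# The integer arrays (ngrams, codes) are what would be block-compressed on disk.
print("stored bytes per n-gram (3 ids + 1 code):", ngrams.itemsize * 3 + codes.itemsize)
print("log-prob of (5, 17, 2) after round trip:", round(dequantize(codes[0]), 3))
```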

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (speech recognition) of response in the active touchscreen region .
Back-Off Language Model Compression . With the availability of large amounts of training data relevant to speech recognition (first mode) scenarios , scalability becomes a very productive way to improve language model performance . We present a technique that represents a back-off n-gram language model using arrays of integer values and thus renders it amenable to effective block compression . We propose a few such compression algorithms and evaluate the resulting language model along two dimensions : memory footprint , and speed reduction relative to the uncompressed one . We experimented with a model that uses a 32-bit word vocabulary (at most 4B words) and log-probabilities/back-off-weights quantized to 1 byte , respectively . The best compression algorithm achieves 2.6 bytes/n-gram at approximately 18X slower than uncompressed . For faster LM operation we found it feasible to represent the LM at approximately 4.0 bytes/n-gram , and approximately 3X slower than the uncompressed LM . The memory footprint of a LM containing one billion n-grams can thus be reduced to 3-4 Gbytes without impacting its speed too much .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (two dimensions) using predefined set of gestures to toggle a full-screen mode .
Back-Off Language Model Compression . With the availability of large amounts of training data relevant to speech recognition scenarios , scalability becomes a very productive way to improve language model performance . We present a technique that represents a back-off n-gram language model using arrays of integer values and thus renders it amenable to effective block compression . We propose a few such compression algorithms and evaluate the resulting language model along two dimensions (status bar visibility) : memory footprint , and speed reduction relative to the uncompressed one . We experimented with a model that uses a 32-bit word vocabulary (at most 4B words) and log-probabilities/back-off-weights quantized to 1 byte , respectively . The best compression algorithm achieves 2.6 bytes/n-gram at approximately 18X slower than uncompressed . For faster LM operation we found it feasible to represent the LM at approximately 4.0 bytes/n-gram , and approximately 3X slower than the uncompressed LM . The memory footprint of a LM containing one billion n-grams can thus be reduced to 3-4 Gbytes without impacting its speed too much .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (compression algorithm) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
Back-Off Language Model Compression . With the availability of large amounts of training data relevant to speech recognition scenarios , scalability becomes a very productive way to improve language model performance . We present a technique that represents a back-off n-gram language model using arrays of integer values and thus renders it amenable to effective block compression . We propose a few such compression algorithms (third set) and evaluate the resulting language model along two dimensions : memory footprint , and speed reduction relative to the uncompressed one . We experimented with a model that uses a 32-bit word vocabulary (at most 4B words) and log-probabilities/back-off-weights quantized to 1 byte , respectively . The best compression algorithm achieves 2.6 bytes/n-gram at approximately 18X slower than uncompressed . For faster LM operation we found it feasible to represent the LM at approximately 4.0 bytes/n-gram , and approximately 3X slower than the uncompressed LM . The memory footprint of a LM containing one billion n-grams can thus be reduced to 3-4 Gbytes without impacting its speed too much .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (speech recognition) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
Back-Off Language Model Compression . With the availability of large amounts of training data relevant to speech recognition (first mode) scenarios , scalability becomes a very productive way to improve language model performance . We present a technique that represents a back-off n-gram language model using arrays of integer values and thus renders it amenable to effective block compression . We propose a few such compression algorithms and evaluate the resulting language model along two dimensions : memory footprint , and speed reduction relative to the uncompressed one . We experimented with a model that uses a 32-bit word vocabulary (at most 4B words) and log-probabilities/back-off-weights quantized to 1 byte , respectively . The best compression algorithm achieves 2.6 bytes/n-gram at approximately 18X slower than uncompressed . For faster LM operation we found it feasible to represent the LM at approximately 4.0 bytes/n-gram , and approximately 3X slower than the uncompressed LM . The memory footprint of a LM containing one billion n-grams can thus be reduced to 3-4 Gbytes without impacting its speed too much .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
PROCEDINGS OF THE 11TH IASTED INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING. : 131-136 2007

Publication Year: 2007

Active, A Tool For Building Intelligent User Interfaces

The École polytechnique fédérale de Lausanne (EPFL)

Guzzoni, Baur, Cheyer, Delpobil
US9645663B2
CLAIM 1
. A display system for an electronic device (hand gestures) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (speech recognition) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition (first mode) and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .
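Purely as an illustration of the "production rule engine" phrase in the abstract, here is a tiny forward-chaining rule loop over a working memory of facts. It bears no relation to the real Active framework's rule language or API; the facts and rules are invented.

```python
# Tiny, generic production-rule loop of the kind the Active abstract alludes to
# (rules firing against a working memory of facts). Purely illustrative; it does not
# reflect the real Active framework's API or rule language.
facts = {"modality": "speech", "utterance": "show next image"}
rules = [
    # (condition over facts, action producing new facts)
    (lambda f: f.get("modality") == "speech" and "next image" in f.get("utterance", ""),
     lambda f: {"command": "advance_image_viewer"}),
    (lambda f: f.get("command") == "advance_image_viewer",
     lambda f: {"status": "dispatched to operating-room display service"}),
]

changed = True
while changed:                      # forward-chain until no rule adds anything new
    changed = False
    for condition, action in rules:
        if condition(facts):
            new = action(facts)
            if not new.items() <= facts.items():
                facts.update(new)
                changed = True
print(facts)
```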

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (speech recognition) of response in the active touchscreen region .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition (first mode) and hand gestures , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (hand gestures) for the gestural hardware on how a multi-touch input will be processed .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (hand gestures) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 13
. The electronic device (hand gestures) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 14
. An electronic device (hand gestures) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (speech recognition) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition (first mode) and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 15
. The electronic device (hand gestures) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (hand gestures) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .
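As a hypothetical illustration of the claim-16 personalization step above, the sketch below registers a detected finger-contact region together with the user's chosen response type and returns that behavior on lookup. The region encoding, the default, and the response names are assumptions; persistence and the enrollment UI are omitted.

```python
# Sketch of the claim-16 personalization step: the region a user's fingers habitually
# cover is registered as (part of) the virtual bezel, together with the response type
# the user has chosen for it. Region encoding and defaults are assumptions only.
class BezelBehaviorStore:
    def __init__(self):
        self.behaviors = {}  # (x0, y0, x1, y1) region -> user's chosen response type

    def register(self, region, response_type):
        """Persist the user's instruction for how touches in this region are handled."""
        self.behaviors[region] = response_type

    def response_for(self, region):
        # Default: treat contact in an unknown region as incidental grip and ignore it.
        return self.behaviors.get(region, "ignore")

if __name__ == "__main__":
    store = BezelBehaviorStore()
    left_grip = (0.0, 0.2, 0.06, 0.9)                  # strip where the thumb rests
    store.register(left_grip, "scroll_active_region")  # response type picked by the user
    print(store.response_for(left_grip))
```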

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (hand gestures) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (hand gestures) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
Active , A Tool For Building Intelligent User Interfaces . Computers have become affordable , small , omnipresent and are often connected to the Internet . However , despite the availability of such rich environment , user interfaces have not been adapted to fully leverage its potential . To help with complex tasks , a new type of software is needed to provide more user-centric systems that act as "intelligent assistants" , able to interact naturally with human users and with the information environment . Building an intelligent assistant is a difficult task that requires expertise in many fields ranging from artificial intelligence to core software and hardware engineering . We believe that providing a unified tool and methodology to create intelligent software will bring many benefits to this area of research . Our solution , the Active framework , combines an innovative production rule engine with communities of services to model and implement intelligent assistants . In the medical field , our approach is used to build an operating room assistant . Using natural modalities such as speech recognition and hand gestures (electronic device, electronic device status display panel) , it enables surgeons to interact with computer based equipments of the operating room as if they were active members of the team . In a broader context , Active aims to ease the development of intelligent software by making required technologies more accessible .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2013012667A1

Filed: 2012-07-12     Issued: 2013-01-24

Touch sensitive displays

(Original Assignee) Apple Inc.     

Wei Chen, Steven P. Hotelling, John Z. Zhong, Shih-Chang Chang, Steven S. POON
US9645663B2
CLAIM 1
. A display system (control signals) for an electronic device (electronic device) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
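For context on the drive/sense language in WO2013012667A1 claim 7, here is a generic simulation of scanning a small capacitive grid: each frame is compared against a baseline and the largest drop marks the touched node. The electrode counts, baseline model, and signal values are invented and do not describe the reference's actual circuitry.

```python
# Generic simulation of the drive/sense idea in WO2013012667A1 claim 7: capacitance
# readings are gathered for every drive/sense node, and the node with the largest
# drop relative to the untouched baseline indicates a touch. All values are invented.
import numpy as np

rng = np.random.default_rng(2)
N_DRIVE, N_SENSE = 6, 6
baseline = np.full((N_DRIVE, N_SENSE), 100.0)       # untouched mutual capacitance (a.u.)

def scan(touch=None):
    """Return one full frame of readings (one value per drive/sense node)."""
    frame = baseline + rng.normal(scale=0.3, size=baseline.shape)
    if touch is not None:
        d, s = touch
        frame[d, s] -= 12.0                          # a finger reduces mutual capacitance
    return frame

delta = baseline - scan(touch=(2, 4))
print("touched node:", np.unravel_index(delta.argmax(), delta.shape))
```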

US9645663B2
CLAIM 2
. The display system (control signals) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 3
. The display system (control signals) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 4
. The display system (control signals) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 5
. The display system (control signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 6
. The display system (control signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 7
. The display system (control signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 8
. The display system (control signals) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 9
. The display system (control signals) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 10
. The display system (control signals) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 11
. The display system (control signals) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .
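
Illustrative note (a sketch under assumed names, not the patent's implementation): claims 9 through 11 recite a pre-defined set of touch-based soft buttons in the virtual bezel that the user can reposition, toggle between visible and hidden, and supplement with new buttons. The Python sketch below models those three operations; the button names, coordinates, and bounds check are invented.

```python
from dataclasses import dataclass

@dataclass
class SoftButton:
    name: str
    x: int          # position inside the virtual bezel, in pixels
    y: int
    visible: bool = True

class BezelSoftButtons:
    """Pre-defined touch-based soft buttons living inside the virtual bezel."""

    def __init__(self, bezel_bounds):
        self.bounds = bezel_bounds   # (x0, y0, x1, y1) rectangle of the bezel
        self.buttons = {"back": SoftButton("back", 10, 10),
                        "home": SoftButton("home", 60, 10)}

    def _check_inside(self, x, y):
        x0, y0, x1, y1 = self.bounds
        if not (x0 <= x <= x1 and y0 <= y <= y1):
            raise ValueError("soft buttons must stay inside the virtual bezel")

    def reposition(self, name, x, y):        # claim 9: user moves a button
        self._check_inside(x, y)
        self.buttons[name].x, self.buttons[name].y = x, y

    def toggle(self, name):                  # claim 10: visible <-> hidden
        self.buttons[name].visible = not self.buttons[name].visible

    def add(self, name, x, y):               # claim 11: user adds a button
        self._check_inside(x, y)
        self.buttons[name] = SoftButton(name, x, y)

bezel = BezelSoftButtons((0, 0, 1080, 48))
bezel.reposition("back", 200, 12)
bezel.toggle("home")
bezel.add("screenshot", 400, 12)
```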

US9645663B2
CLAIM 12
. The display system (control signals) according to claim 9 , wherein the display screen comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2013012667A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
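
Illustrative note (hypothetical sketch only): claims 12 and 13 recite a device status display panel that the user can toggle between visible and hidden, plus a third set of touch-based inputs that lets the user navigate while the panel and soft buttons are hidden. The Python sketch below models that fallback; the gesture-to-action mapping is an assumption made for illustration.

```python
class DeviceChrome:
    """Hide/show the status panel and fall back to navigation gestures
    when all on-screen chrome is hidden (claims 12 and 13)."""

    def __init__(self):
        self.status_panel_visible = True
        self.soft_buttons_visible = True

    def toggle_status_panel(self):           # claim 12: visible <-> hidden
        self.status_panel_visible = not self.status_panel_visible

    def handle_gesture(self, gesture: str) -> str:
        chrome_hidden = not (self.status_panel_visible or self.soft_buttons_visible)
        if chrome_hidden:
            # "Third set" of inputs: gestures that navigate the device
            # while the panel and the soft buttons are hidden (claim 13).
            return {"swipe_left": "previous_screen",
                    "swipe_right": "next_screen"}.get(gesture, "ignored")
        return "handled_by_visible_controls"

chrome = DeviceChrome()
chrome.toggle_status_panel()
chrome.soft_buttons_visible = False
print(chrome.handle_gesture("swipe_left"))   # 'previous_screen'
```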

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
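
Illustrative note (a plausible reading, not the claimed gestural software application itself): claims 14 and 15 turn on a second mode of response that "selectively" interprets virtual bezel touches as intentional input affecting the active-region content. The Python sketch below shows one selective-interpretation heuristic; the duration and travel thresholds are invented values, not taken from the patent.

```python
def classify_bezel_touch(duration_s: float, travel_px: float,
                         min_duration: float = 0.05, min_travel: int = 24) -> str:
    """Second mode of response in the virtual bezel: brief, nearly stationary
    contacts are treated as incidental grip, while deliberate swipes are
    forwarded to affect the content shown in the active touchscreen region."""
    if duration_s >= min_duration and travel_px >= min_travel:
        return "intentional: forward to affect active-region content"
    return "incidental: ignore (resting fingers)"

print(classify_bezel_touch(0.3, 80))   # intentional
print(classify_bezel_touch(0.2, 3))    # incidental
```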

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
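
Illustrative note (hypothetical walk-through, not the claimed method as practiced): claim 16 detects the region of the display touched by the holding hand, registers it as the virtual bezel, and stores the user's chosen response type as personalized behavior. The Python sketch below mirrors those steps; the class, the bounding-box grip detection, and the stored keys are assumptions for this sketch only.

```python
class VirtualBezelSetup:
    """Rough walk-through of the steps recited in claim 16."""

    def __init__(self):
        self.memory = {}   # stands in for the device's persistent memory

    def detect_grip_region(self, touch_points):
        # Touch points that persist while the device is held are assumed to
        # mark where the user's fingers rest along the screen edge.
        xs = [p[0] for p in touch_points]
        ys = [p[1] for p in touch_points]
        return (min(xs), min(ys), max(xs), max(ys))   # bounding box

    def register_bezel(self, touch_points):
        self.memory["virtual_bezel_region"] = self.detect_grip_region(touch_points)

    def register_user_response(self, response_type):
        # The user is offered a choice of how bezel input should behave,
        # and the answer is stored as personalized behavior for the region.
        self.memory["bezel_response"] = response_type

setup = VirtualBezelSetup()
setup.register_bezel([(0, 200), (8, 260), (6, 320)])
setup.register_user_response("scroll_active_region")
print(setup.memory)
```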

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (front surface) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface (holding pattern) of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
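
Illustrative note (sketch under stated assumptions): claim 17 accumulates unintentional-touch polygons and their access frequency to build a personalized holding pattern that defines the virtual bezel. The Python sketch below keeps the most frequently hit polygons; the frequency threshold and class name are invented for illustration.

```python
from collections import Counter

class HoldingPatternLearner:
    """Accumulate accidental-touch polygons and their usage frequency, then
    keep the frequent ones as the personalized holding pattern that defines
    the virtual bezel region (claim 17)."""

    def __init__(self, min_hits: int = 5):
        self.hits = Counter()          # polygon (tuple of vertices) -> count
        self.min_hits = min_hits

    def record_unintentional_touch(self, polygon_vertices):
        self.hits[tuple(polygon_vertices)] += 1

    def holding_pattern(self):
        return [list(poly) for poly, n in self.hits.items() if n >= self.min_hits]

learner = HoldingPatternLearner(min_hits=2)
edge = [(0, 100), (12, 100), (12, 300), (0, 300)]
learner.record_unintentional_touch(edge)
learner.record_unintentional_touch(edge)
print(learner.holding_pattern())   # the edge polygon becomes the bezel region
```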

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (front surface) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2013012667A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface (holding pattern) of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
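
Illustrative note (toy sketch, not a disclosed sensor pipeline): claim 18 derives the polygonal area from a heat signature reported by the device's thermal sensors. The Python sketch below thresholds a hypothetical per-cell temperature map into an axis-aligned polygon over the warm cells, i.e. where the holding hand rests; the grid, threshold, and function name are assumptions.

```python
def heat_signature_polygon(thermal_map, threshold: float = 31.0):
    """Threshold a per-cell temperature map (degrees C) from hypothetical
    edge thermal sensors and return the bounding polygon of the warm cells."""
    warm = [(x, y) for y, row in enumerate(thermal_map)
                   for x, t in enumerate(row) if t >= threshold]
    if not warm:
        return []
    xs, ys = zip(*warm)
    # Axis-aligned polygon (4 vertices) covering the warm area.
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

grid = [[24, 24, 24],
        [33, 24, 24],
        [34, 32, 24]]
print(heat_signature_polygon(grid))   # [(0, 1), (1, 1), (1, 2), (0, 2)]
```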




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
JP2012142033A

Filed: 2012-04-26     Issued: 2012-07-26

Multi-functional handheld device (多機能ハンドヘルド装置)

(Original Assignee) Apple Inc; アップル インコーポレイテッド     

P Hotelling Steve, ホテリング,スティーブ,ピー.
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (制御器) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (動作モード) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
JP2012142033A
CLAIM 2
The handheld electronic device according to claim 1 , wherein the handheld electronic device includes two or more of the following device functionalities : a PDA , a mobile phone , a music player , a camera , a video player , a game player , a handtop , an Internet terminal , a GPS receiver , and a remote controller (display screen) .

JP2012142033A
CLAIM 19
The handheld electronic device according to claim 1 , wherein the handheld electronic device is configurable to actively look for signals in the surrounding environment and to change a user interface or an operating mode (first mode) based on the signals .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (動作モード) of response in the active touchscreen region .
JP2012142033A
CLAIM 19
The handheld electronic device according to claim 1 , wherein the handheld electronic device is configurable to actively look for signals in the surrounding environment and to change a user interface or an operating mode (first mode) based on the signals .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (制御器) .
JP2012142033A
CLAIM 2
The handheld electronic device according to claim 1 , wherein the handheld electronic device includes two or more of the following device functionalities : a PDA , a mobile phone , a music player , a camera , a video player , a game player , a handtop , an Internet terminal , a GPS receiver , and a remote controller (display screen) .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (制御器) .
JP2012142033A
CLAIM 2
The handheld electronic device according to claim 1 , wherein the handheld electronic device includes two or more of the following device functionalities : a PDA , a mobile phone , a music player , a camera , a video player , a game player , a handtop , an Internet terminal , a GPS receiver , and a remote controller (display screen) .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (制御器) comprises an electronic device status display panel displaying at least one information item from a set of information items (ハウジング) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
JP2012142033A
CLAIM 2
The handheld electronic device according to claim 1 , wherein the handheld electronic device includes two or more of the following device functionalities : a PDA , a mobile phone , a music player , a camera , a video player , a game player , a handtop , an Internet terminal , a GPS receiver , and a remote controller (display screen) .

JP2012142033A
CLAIM 20
A handheld computing device comprising : a housing (information items) ; a display arrangement located within the housing and including a display and a touch screen ; and a device configured to generate a signal when a portion of the display arrangement is moved .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (制御器) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (動作モード) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
JP2012142033A
CLAIM 2
The handheld electronic device according to claim 1 , wherein the handheld electronic device includes two or more of the following device functionalities : a PDA , a mobile phone , a music player , a camera , a video player , a game player , a handtop , an Internet terminal , a GPS receiver , and a remote controller (display screen) .

JP2012142033A
CLAIM 19
The handheld electronic device according to claim 1 , wherein the handheld electronic device is configurable to actively look for signals in the surrounding environment and to change a user interface or an operating mode (first mode) based on the signals .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (制御器) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
JP2012142033A
CLAIM 2
The handheld electronic device according to claim 1 , wherein the handheld electronic device includes two or more of the following device functionalities : a PDA , a mobile phone , a music player , a camera , a video player , a game player , a handtop , an Internet terminal , a GPS receiver , and a remote controller (display screen) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120266079A1

Filed: 2012-04-17     Issued: 2012-10-18

Usability of cross-device user interfaces

(Original Assignee) Splashtop Inc     (Current Assignee) Splashtop Inc

Mark Lee, Kay Chen, Yu Qing Cheng
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (defined area) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120266079A1
CLAIM 7
. The apparatus of claim 2 , wherein predicting need for active scrolling uses at least one technique from the set of techniques comprising : using image processing algorithms to detect a scroll bar within an image of said user interface and knowing location of scroll bars and respective controls thereof , interpreting user clicks or touches in a predefined area (first set) surrounding the scroll bar or the controls as engaging a corresponding control element ;
and providing dedicated magnified scrolling controls for overlaying original controls or for overlaying elsewhere in said user interface .
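
Illustrative note (one plausible reading of the "predefined area surrounding the scroll bar" recited in claim 7 of US20120266079A1, not the reference's actual implementation): the Python sketch below treats a touch landing within a margin of a scroll-bar rectangle, assumed to have been located by image processing, as engaging the scroll control. The rectangle, margin, and function name are invented.

```python
def hit_test_scroll_bar(touch, scroll_bar_rect, margin: int = 20) -> bool:
    """Return True when a touch falls inside the predefined area that
    surrounds the detected scroll bar rectangle."""
    x, y = touch
    x0, y0, x1, y1 = scroll_bar_rect     # rectangle found by image processing
    return (x0 - margin <= x <= x1 + margin) and (y0 - margin <= y <= y1 + margin)

print(hit_test_scroll_bar((305, 120), (310, 40, 318, 400)))   # True
```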

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (determine location) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120266079A1
CLAIM 10
. The apparatus of claim 2 , wherein predicting application-specific control needs uses at least one technique from the set of techniques comprising : using image processing algorithms to determine locations (electronic device status display panel) of application-specific controls and knowing locations of said application-specific controls , interpreting user clicks or touches in a predefined area surrounding said application-specific controls as engaging said application-specific controls ;
and providing dedicated application-specific controls for overlaying original controls or for overlaying elsewhere in said user interface , wherein said dedicated application-specific controls duplicate or replace the functionality of said original controls ;
and providing special key combination , gestures , or other inputs that cause window control commands to be sent to said server .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (determine location) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120266079A1
CLAIM 10
. The apparatus of claim 2 , wherein predicting application-specific control needs uses at least one technique from the set of techniques comprising : using image processing algorithms to determine locations (electronic device status display panel) of application-specific controls and knowing locations of said application-specific controls , interpreting user clicks or touches in a predefined area surrounding said application-specific controls as engaging said application-specific controls ;
and providing dedicated application-specific controls for overlaying original controls or for overlaying elsewhere in said user interface , wherein said dedicated application-specific controls duplicate or replace the functionality of said original controls ;
and providing special key combination , gestures , or other inputs that cause window control commands to be sent to said server .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices (user intent) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (user interface element) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120266079A1
CLAIM 1
. An apparatus for improving usability of cross-device user interfaces , comprising : a server configured for being in a remote session with a client device ;
said server configured for having at least one user interface that is used in said remote session ;
said server configured for receiving at least one user interaction event from said client device wherein the user interaction event is intended for said user interface ;
said server configured for predicting a user intent (area comprising vertices) based in part on said received user interaction event and in response to receiving said user interaction event ;
and said server configured for offering a corresponding user interface tool to be used with said user interface or for modifying said user interface in response to said predicted user intent .

US20120266079A1
CLAIM 24
. The apparatus of claim 15 , further comprising : a programming kit for an end-user to define his or her own user interface element (holding pattern) layout and associated gesture mapping actions to said server .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices (user intent) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (user interface element) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120266079A1
CLAIM 1
. An apparatus for improving usability of cross-device user interfaces , comprising : a server configured for being in a remote session with a client device ;
said server configured for having at least one user interface that is used in said remote session ;
said server configured for receiving at least one user interaction event from said client device wherein the user interaction event is intended for said user interface ;
said server configured for predicting a user intent (area comprising vertices) based in part on said received user interaction event and in response to receiving said user interaction event ;
and said server configured for offering a corresponding user interface tool to be used with said user interface or for modifying said user interface in response to said predicted user intent .

US20120266079A1
CLAIM 24
. The apparatus of claim 15 , further comprising : a programming kit for an end-user to define his or her own user interface element (holding pattern) layout and associated gesture mapping actions to said server .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
KR20120092036A

Filed: 2012-02-07     Issued: 2012-08-20

Portable device having a touch screen display and control method thereof (터치 스크린 디스플레이를 구비한 휴대 기기 및 그 제어 방법)

(Original Assignee) 삼성전자주식회사     

서준규, 강경아, 곽지연, 김현진, 이주연
US9645663B2
CLAIM 1
. A display system for an electronic device (영역들) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .
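
Illustrative note (a sketch of the behavior quoted in claim 7 of KR20120092036A, not that reference's implementation): the Python sketch below shows a tap inside any of several event-creation input regions bringing up a virtual keypad in a predetermined screen region. The region coordinates and names are assumptions made for illustration.

```python
INPUT_REGIONS = {"title": (0, 0, 200, 40), "start_date": (0, 50, 200, 90)}
KEYPAD_REGION = (0, 400, 480, 640)    # predetermined keypad area (invented)

def on_tap(x, y):
    """A tap gesture inside any event-creation input region focuses that
    region and requests a virtual keypad in the predetermined area."""
    for name, (x0, y0, x1, y1) in INPUT_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return {"focus": name, "show_keypad_at": KEYPAD_REGION}
    return {"focus": None, "show_keypad_at": None}

print(on_tap(10, 20))   # focuses "title" and shows the keypad
```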

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (영역들) for the gestural hardware on how a multi-touch input will be processed .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (영역들) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 13
. The electronic device (영역들) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 14
. An electronic device (영역들) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 15
. The electronic device (영역들) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (영역들) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (영역들) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (영역들) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (이미지) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
KR20120092036A
CLAIM 7
The control method according to claim 2 , comprising : while displaying the calendar region and the event list on the first and second touch screens , detecting a first command for registering a new event ; in response to detecting the first command , displaying a first event creation window on the first touch screen and a second event creation window on the second touch screen , wherein the first event creation window includes an event title input region , a start date input region , an end date input region , and an all-day selection region , and the second event creation window includes at least one of a location input region related to the new event , a participant input region , an alarm selection region , a repeat selection region , a description input region , and a save key ; and , when a tap gesture is detected in at least one of the input regions (electronic device) , displaying a virtual keypad in a predetermined region within the first and second touch screens .

KR20120092036A
CLAIM 45
A control method of a portable device having a first touch screen display and a second touch screen display provided on at least one panel configured to be foldable , the method comprising : detecting a first command for driving at least one camera module provided in the portable device ; in response to detecting the first command , displaying on the first touch screen a captured image from the camera module and a camera menu , wherein the camera menu includes at least one of a shutter button , a stored image (thermal sensors) recall button , a mode selection button , a flash selection button , and a timer shooting button ; in response to detecting the first command , displaying on the second touch screen a camera mode menu for changing a camera mode , wherein the camera mode menu includes at least one of a basic camera button corresponding to a basic camera mode for photographing a scene or a model through the camera module , a self camera button corresponding to a self camera mode for photographing the user of the portable device through the camera module , a dual camera button corresponding to a dual camera mode that provides the captured image through both the first and second touch screens , and a baby camera button corresponding to a baby camera mode that provides a predesignated animation through the touch screen facing the photographed subject ; detecting a first tap gesture on one of the basic camera button , the self camera button , the dual camera button , and the baby camera button and switching to the corresponding camera mode ; when folding of the first and second panels backward is detected while the first and second panels carrying the first and second touch screens are unfolded and the captured image is provided in the basic camera mode , turning off the second touch screen and maintaining the captured image and the camera menu on the first touch screen ; when folding of the first and second panels backward is detected while the panels are unfolded and the captured image is provided in the self camera mode , turning off the first touch screen and displaying the captured image and the camera menu on the second touch screen ; when folding of the first and second panels backward is detected while the panels are unfolded and the captured image is provided in the dual camera mode , displaying the captured image on the second touch screen and maintaining the captured image and the camera menu on the first touch screen ; and when folding of the first and second panels backward is detected while the panels are unfolded and the captured image is provided in the baby camera mode , displaying the predesignated animation on the second touch screen and maintaining the captured image and the camera menu on the first touch screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081319A1

Filed: 2011-09-29     Issued: 2012-04-05

Modifying the display stack upon device open

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Martin Gimpl, Ron Cassar, Paul Edward Reeves, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (display objects) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081319A1
CLAIM 3
. The computer readable medium as defined in claim 2 , wherein the window stack provides a representation of windows or display objects (second set) to the user .

US20120081319A1
CLAIM 9
. The computer readable medium as defined in claim 8 , wherein the logic data structure further comprises a window stack identifier adapted to identify the first portion (first portion) of which the first window is associated .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081319A1
CLAIM 7
. The computer readable medium as defined in claim 6 , wherein the logical data structure comprises one (operating system status bar) or more of : a window identifier adapted to identify the first window in relation to other windows in the window stack ;
a window stack position identifier adapted to identify the position in the window stack for the first window ;
and a display identifier adapted to identify a first touch sensitive display , which is a portion of the composite display of the device , wherein the first window is associated with the first touch sensitive display .
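
Illustrative note (a data-structure sketch under assumed field names, not the reference's code): claim 7 of US20120081319A1 recites a logical data structure carrying a window identifier, a window stack position identifier, and a display identifier. The Python sketch below models such an entry and a simple query over it; all identifiers are invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class WindowStackEntry:
    """One entry of a window stack: the window, its stack position, and the
    touch-sensitive display (portion of the composite display) it belongs to."""
    window_id: str
    stack_position: int      # 0 = top of the stack
    display_id: str

stack = [WindowStackEntry("browser", 0, "display_1"),
         WindowStackEntry("mail", 1, "display_2"),
         WindowStackEntry("notes", 2, "display_1")]

# Topmost window associated with a given touch-sensitive display.
top_on_display_1 = min((e for e in stack if e.display_id == "display_1"),
                       key=lambda e: e.stack_position)
print(top_on_display_1.window_id)   # 'browser'
```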

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081319A1
CLAIM 9
. The computer readable medium as defined in claim 8 , wherein the logic data structure further comprises a window stack identifier adapted to identify the first portion (first portion) of which the first window is associated .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084681A1

Filed: 2011-09-29     Issued: 2012-04-05

Application launch

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Ron Cassar
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084681A1
CLAIM 10
. A non-transitory computer readable medium storing computer executable instructions that when executed by at least one processor perform a method comprising : receiving first input indicating a request to close an application , the application displaying a first window with a first view of the application , wherein the first window is displayed on one of a first touch screen (user input) display of a first screen or a second touch screen display of a second screen ;
in response to receiving the input , closing the first window ;
receiving second input indicating a request to launch the application ;
launching the application ;
and in response to receiving the input indicating a request to launch the application , displaying a second window with the first view of the application .

US20120084681A1
CLAIM 17
. A dual screen communication device , comprising : a first touch sensitive display of a first screen ;
a second touch sensitive display of a second screen ;
a computer readable medium that stores computer executable instructions that when executed by at least one processor perform a method comprising : displaying a window with a view of a multi-screen application , wherein the window is displayed on at least a first portion (first portion) of a first touch sensitive display of a first screen and at least a second portion (second portion, usage frequency) of a second touch sensitive display of a second screen ;
receiving first input indicating a request to close the multi-screen application ;
closing the window of the multi-screen application ;
receiving second input indicating a request to launch the multi-screen application ;
launching the multi-screen application ;
and displaying at least a portion of the view in a second window on one of the first touch sensitive display or the second touch sensitive display .
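
Illustrative note (a behavioral sketch only, not the implementation disclosed in US20120084681A1): claim 10 of the reference closes an application that is showing a first view and, on relaunch, displays a second window with that same first view. The Python sketch below records the view and screen at close time and restores them at launch; the class, defaults, and keys are assumptions.

```python
class AppLauncher:
    """Remember the view shown when an application is closed and redisplay
    that view in a new window when the application is launched again."""

    def __init__(self):
        self.last_view = {}      # app name -> (view shown at close, screen)

    def close(self, app: str, current_view: str, screen: str) -> None:
        self.last_view[app] = (current_view, screen)

    def launch(self, app: str) -> dict:
        view, screen = self.last_view.get(app, ("default_view", "screen_1"))
        return {"app": app, "window": "new", "view": view, "screen": screen}

launcher = AppLauncher()
launcher.close("notes", "edit_page_3", "screen_2")
print(launcher.launch("notes"))   # restores the 'edit_page_3' view
```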

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084681A1
CLAIM 10
. A non-transitory computer readable medium storing computer executable instructions that when executed by at least one processor perform a method comprising : receiving first input indicating a request to close an application , the application displaying a first window with a first view of the application , wherein the first window is displayed on one of a first touch screen (user input) display of a first screen or a second touch screen display of a second screen ;
in response to receiving the input , closing the first window ;
receiving second input indicating a request to launch the application ;
launching the application ;
and in response to receiving the input indicating a request to launch the application , displaying a second window with the first view of the application .

US20120084681A1
CLAIM 17
. A dual screen communication device , comprising : a first touch sensitive display of a first screen ;
a second touch sensitive display of a second screen ;
a computer readable medium that stores computer executable instructions that when executed by at least one processor perform a method comprising : displaying a window with a view of a multi-screen application , wherein the window is displayed on at least a first portion (first portion) of a first touch sensitive display of a first screen and at least a second portion (second portion, usage frequency) of a second touch sensitive display of a second screen ;
receiving first input indicating a request to close the multi-screen application ;
closing the window of the multi-screen application ;
receiving second input indicating a request to launch the multi-screen application ;
launching the multi-screen application ;
and displaying at least a portion of the view in a second window on one of the first touch sensitive display or the second touch sensitive display .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (first touch screen) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084681A1
CLAIM 10
. A non-transitory computer readable medium storing computer executable instructions that when executed by at least one processor perform a method comprising : receiving first input indicating a request to close an application , the application displaying a first window with a first view of the application , wherein the first window is displayed on one of a first touch screen (user input) display of a first screen or a second touch screen display of a second screen ;
in response to receiving the input , closing the first window ;
receiving second input indicating a request to launch the application ;
launching the application ;
and in response to receiving the input indicating a request to launch the application , displaying a second window with the first view of the application .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (first touch screen) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084681A1
CLAIM 10
. A non-transitory computer readable medium storing computer executable instructions that when executed by at least one processor perform a method comprising : receiving first input indicating a request to close an application , the application displaying a first window with a first view of the application , wherein the first window is displayed on one of a first touch screen (user input) display of a first screen or a second touch screen display of a second screen ;
in response to receiving the input , closing the first window ;
receiving second input indicating a request to launch the application ;
launching the application ;
and in response to receiving the input indicating a request to launch the application , displaying a second window with the first view of the application .

US20120084681A1
CLAIM 17
. A dual screen communication device , comprising : a first touch sensitive display of a first screen ;
a second touch sensitive display of a second screen ;
a computer readable medium that stores computer executable instructions that when executed by at least one processor perform a method comprising : displaying a window with a view of a multi-screen application , wherein the window is displayed on at least a first portion of a first touch sensitive display of a first screen and at least a second portion (second portion, usage frequency) of a second touch sensitive display of a second screen ;
receiving first input indicating a request to close the multi-screen application ;
closing the window of the multi-screen application ;
receiving second input indicating a request to launch the multi-screen application ;
launching the multi-screen application ;
and displaying at least a portion of the view in a second window on one of the first touch sensitive display or the second touch sensitive display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (first touch screen) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084681A1
CLAIM 10
. A non-transitory computer readable medium storing computer executable instructions that when executed by at least one processor perform a method comprising : receiving first input indicating a request to close an application , the application displaying a first window with a first view of the application , wherein the first window is displayed on one of a first touch screen (user input) display of a first screen or a second touch screen display of a second screen ;
in response to receiving the input , closing the first window ;
receiving second input indicating a request to launch the application ;
launching the application ;
and in response to receiving the input indicating a request to launch the application , displaying a second window with the first view of the application .

US20120084681A1
CLAIM 17
. A dual screen communication device , comprising : a first touch sensitive display of a first screen ;
a second touch sensitive display of a second screen ;
a computer readable medium that stores computer executable instructions that when executed by at least one processor perform a method comprising : displaying a window with a view of a multi-screen application , wherein the window is displayed on at least a first portion of a first touch sensitive display of a first screen and at least a second portion (second portion, usage frequency) of a second touch sensitive display of a second screen ;
receiving first input indicating a request to close the multi-screen application ;
closing the window of the multi-screen application ;
receiving second input indicating a request to launch the multi-screen application ;
launching the multi-screen application ;
and displaying at least a portion of the view in a second window on one of the first touch sensitive display or the second touch sensitive display .
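
A minimal sketch of the claim 18 workflow of US9645663B2, assuming hypothetical thermal-sensor input: the heat-signature polygon is registered in memory, its access frequency is tracked, and a personalized holding pattern derived from both is stored to define the virtual bezel region. Sensor access and rendering are outside the sketch.

```python
# Assumed data model for the heat-signature / usage-frequency workflow.
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class VirtualBezelProfile:
    polygon: List[Point] = field(default_factory=list)   # vertices of the heat-signature area
    access_count: int = 0                                 # detected usage frequency
    holding_pattern: List[Point] = field(default_factory=list)

    def register_heat_signature(self, vertices: List[Point]) -> None:
        """Register the polygonal area formed by the user's hand in memory."""
        self.polygon = list(vertices)

    def record_access(self) -> None:
        """Detect (count) accesses of the registered polygonal area."""
        self.access_count += 1

    def define_holding_pattern(self, min_accesses: int = 3) -> List[Point]:
        """Use the polygon and its usage frequency to personalize the bezel."""
        if self.access_count >= min_accesses:
            self.holding_pattern = self.polygon
        return self.holding_pattern

profile = VirtualBezelProfile()
profile.register_heat_signature([(0, 0), (0, 120), (40, 120), (40, 0)])
for _ in range(4):
    profile.record_access()
bezel = profile.define_holding_pattern()   # registered as the virtual bezel region
```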




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084725A1

Filed: 2011-09-29     Issued: 2012-04-05

Managing hierarchically related windows in a single display

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Paul E. Reeves, Alexander de Paz, Jared L. Ficklin, Denise Burton, Gregg Wygonik
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084725A1
CLAIM 5
. The method of claim 1 , wherein the selected window and currently displayed window are output by different applications (first set) and wherein the communication device comprises multiple screens , a first screen being operable and a second screen being inoperable .
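
A minimal sketch, not the patented implementation, of the two-mode behavior recited in claim 1 of US9645663B2: touches in the active touchscreen region receive a first mode of response, while touches in the virtual bezel region receive a second mode that selectively interprets them as intentional input affecting the active-region content. The screen geometry and gesture flag are assumptions.

```python
# Assumed geometry: a 1080x1920 screen with a 60-pixel bezel strip on each side.
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x < self.right and self.top <= y < self.bottom

SCREEN = Rect(0, 0, 1080, 1920)
ACTIVE = Rect(60, 0, 1020, 1920)   # active touchscreen region
# virtual bezel region = screen area outside ACTIVE, along the left/right edges

def handle_touch(x: float, y: float, is_gesture: bool) -> str:
    """First mode: normal handling; second mode: only gestures count as intentional."""
    if ACTIVE.contains(x, y):
        return "first-mode: deliver touch to active-region content"
    if is_gesture:
        return "second-mode: intentional bezel gesture affects active-region content"
    return "second-mode: treat as grip contact and ignore"

print(handle_touch(500, 900, is_gesture=False))  # active-region tap
print(handle_touch(20, 900, is_gesture=True))    # bezel swipe -> affects content
print(handle_touch(20, 900, is_gesture=False))   # resting thumb -> ignored
```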

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (play mode) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084725A1
CLAIM 17
. The device of claim 8 , wherein the multi-display communication device is in the portrait display mode (operating system status bar) and wherein the predetermined touch sensitive display is the right-most touch sensitive display .
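
A minimal sketch of claim 8 of US9645663B2, assuming a hypothetical gesture classifier: a predefined set of gestures performed in the virtual bezel region toggles the operating system status bar between visible and hidden, thereby toggling a full-screen mode.

```python
# Assumed gesture names; the classifier producing them is outside the sketch.
TOGGLE_GESTURES = {"two-finger-swipe-down", "double-tap-bezel"}

class StatusBar:
    def __init__(self) -> None:
        self.visible = True

    def on_bezel_gesture(self, gesture: str) -> bool:
        """Return the new visibility after processing a bezel gesture."""
        if gesture in TOGGLE_GESTURES:
            self.visible = not self.visible   # toggles full-screen mode
        return self.visible

bar = StatusBar()
bar.on_bezel_gesture("double-tap-bezel")   # hide the status bar -> full-screen mode
bar.on_bezel_gesture("double-tap-bezel")   # show it again
```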




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044809A1

Filed: 2011-09-29     Issued: 2012-04-05

Repositioning windows in the pop-up window

(Original Assignee) Imerj LLC     

Sanjiv Sirpal, Martin Gimpl, Eduardo Diego Torres Milano
US9645663B2
CLAIM 1
. A display system (second set, one screen) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set, one screen) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .
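
A minimal sketch of the window-stack behavior recited in WO2012044809A1 claim 17: an ordered logical representation of desktops and applications in which a selected item can be moved to a visible position (one of the two displays) or a non-visible position, after which the stack is displayed in a new order. The stack contents and two-display convention are assumptions.

```python
# Assumed convention: the first two stack entries occupy the first and second displays.
from typing import List, Optional

class WindowStack:
    def __init__(self, items: List[str]) -> None:
        self.items = list(items)          # logical representation, ordered

    def visible(self) -> List[Optional[str]]:
        """Entries currently shown on the first and second displays."""
        padded = self.items + [None, None]
        return padded[:2]

    def move(self, item: str, position: int) -> List[str]:
        """Selectively move an application/desktop to a different position."""
        self.items.remove(item)
        self.items.insert(position, item)
        return self.items                 # stack now displayed in a new order

stack = WindowStack(["mail", "browser", "desktop-1", "notes"])
print(stack.visible())        # ['mail', 'browser'] on displays 1 and 2
stack.move("notes", 0)        # bring 'notes' to a visible position
print(stack.visible())        # ['notes', 'mail']
```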

US9645663B2
CLAIM 2
. The display system (second set, one screen) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 3
. The display system (second set, one screen) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .
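
A minimal sketch covering claims 3 and 4 of US9645663B2: a touch stroke that crosses the boundary between the active touchscreen region and the virtual bezel region is processed according to the region in which it originated. The boundary coordinate is an assumption.

```python
# Assumed geometry: the bezel occupies x < 60; the active region occupies x >= 60.
from typing import List, Tuple

Point = Tuple[float, float]
ACTIVE_LEFT_EDGE = 60.0

def region_of(point: Point) -> str:
    return "active" if point[0] >= ACTIVE_LEFT_EDGE else "bezel"

def attribute_stroke(stroke: List[Point]) -> str:
    """Attribute the whole stroke to the region where it originated."""
    return region_of(stroke[0])

# originates in the active region, terminates in the bezel -> active-region input (claim 3)
print(attribute_stroke([(300, 500), (150, 500), (20, 500)]))   # "active"
# originates in the bezel, terminates in the active region -> bezel input (claim 4)
print(attribute_stroke([(20, 500), (150, 500), (300, 500)]))   # "bezel"
```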

US9645663B2
CLAIM 4
. The display system (second set, one screen) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 5
. The display system (second set, one screen) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 6
. The display system (second set, one screen) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 7
. The display system (second set, one screen) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .
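
A minimal sketch of claim 7 of US9645663B2, with an assumed preference store: when a multi-touch input originates simultaneously in the active touchscreen region and the virtual bezel region, it is processed according to an instruction the user has previously given for how such input should be handled.

```python
# Assumed geometry and preference values ("active" or "bezel").
from typing import List, Tuple

Point = Tuple[float, float]
ACTIVE_LEFT_EDGE = 60.0

def region_of(p: Point) -> str:
    return "active" if p[0] >= ACTIVE_LEFT_EDGE else "bezel"

class MultiTouchPolicy:
    def __init__(self, user_preference: str = "active") -> None:
        # user instruction: route spanning multi-touch to "active" or "bezel"
        self.user_preference = user_preference

    def route(self, touches: List[Point]) -> str:
        regions = {region_of(t) for t in touches}
        if regions == {"active", "bezel"}:
            return self.user_preference      # spanning input follows the user instruction
        return regions.pop()                 # otherwise route to the single region touched

policy = MultiTouchPolicy(user_preference="bezel")
print(policy.route([(20, 400), (500, 400)]))   # spanning input -> "bezel" per user choice
```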

US9645663B2
CLAIM 8
. The display system (second set, one screen) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 9
. The display system (second set, one screen) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .
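
A minimal sketch covering claims 9 through 11 of US9645663B2, with assumed button names: a predefined set of touch-based soft buttons resides in the virtual bezel region, and the user can reposition a button within the bezel, toggle it between visible and hidden modes, and add new buttons.

```python
# Assumed predefined buttons and coordinates within the bezel strip.
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class SoftButton:
    position: Tuple[float, float]   # location within the virtual bezel region
    visible: bool = True

class BezelButtons:
    def __init__(self) -> None:
        self.buttons: Dict[str, SoftButton] = {
            "back": SoftButton((10, 1800)),
            "home": SoftButton((10, 960)),
        }

    def reposition(self, name: str, new_pos: Tuple[float, float]) -> None:
        self.buttons[name].position = new_pos          # claim 9: reposition within bezel

    def toggle(self, name: str) -> bool:
        btn = self.buttons[name]
        btn.visible = not btn.visible                  # claim 10: visible/hidden mode
        return btn.visible

    def add(self, name: str, pos: Tuple[float, float]) -> None:
        self.buttons[name] = SoftButton(pos)           # claim 11: add a soft button

bezel = BezelButtons()
bezel.reposition("home", (10, 200))
bezel.toggle("back")
bezel.add("screenshot", (10, 1500))
```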

US9645663B2
CLAIM 10
. The display system (second set, one screen) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 11
. The display system (second set, one screen) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 12
. The display system (second set, one screen) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044809A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2012044809A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set (third set) of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044809A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044809A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .
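
A minimal sketch of the claim 16 flow of US9645663B2, with assumed names: the region of the touchscreen in contact with the holding hand is registered as the virtual bezel region, the user is asked what type of response should be executed for input there, and that instruction is stored as personalized behavior for the detected region.

```python
# Assumed data model; detection of the contact region and the UI prompt are outside the sketch.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point = Tuple[float, float]

@dataclass
class PersonalizedBezel:
    region: List[Point] = field(default_factory=list)
    behavior: Dict[str, str] = field(default_factory=dict)

    def register_contact_region(self, contact_points: List[Point]) -> None:
        """Register the detected finger-contact region as the virtual bezel."""
        self.region = list(contact_points)

    def set_user_response(self, input_type: str, response: str) -> None:
        """Store the user's instruction for how bezel input should be handled."""
        self.behavior[input_type] = response

    def handle_bezel_input(self, input_type: str) -> str:
        return self.behavior.get(input_type, "ignore")

bezel = PersonalizedBezel()
bezel.register_contact_region([(0, 700), (0, 900), (35, 900), (35, 700)])
bezel.set_user_response("swipe-up", "scroll active-region content")
print(bezel.handle_bezel_input("swipe-up"))
print(bezel.handle_bezel_input("rest"))      # unconfigured input is ignored
```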

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044809A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044809A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044743A2

Filed: 2011-09-29     Issued: 2012-04-05

Gravity drop

(Original Assignee) Imerj LLC     

Alexander De Paz
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044743A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual- screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .

WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
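
A minimal sketch of WO2012044743A2 claim 16, with assumed sensor values: a hinge-position signal indicates that the dual-screen device has opened, a second signal gives the relative position of the two screens, and the content shown on the two displays is controlled accordingly.

```python
# Assumed thresholds and content labels; real sensor handling is outside the sketch.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DualScreenController:
    running_app: str = "browser"        # app displayed on the first screen while closed

    def on_sensors(self, hinge_angle: float, first_screen_on_left: bool) -> Tuple[str, str]:
        """Return (left display content, right display content)."""
        if hinge_angle < 150:                    # hinge not opened far enough
            return (self.running_app, "off")
        expanded = f"{self.running_app}: extended view"
        if first_screen_on_left:
            return (self.running_app, expanded)  # keep the app where it was, extend right
        return (expanded, self.running_app)      # screens reversed after opening

ctrl = DualScreenController()
print(ctrl.on_sensors(hinge_angle=170, first_screen_on_left=True))
print(ctrl.on_sensors(hinge_angle=170, first_screen_on_left=False))
```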

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
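
For illustration only, the boundary-crossing rules recited in claims 3 and 4 of US9645663B2 can be read as a single routing decision: a stroke is attributed to the region in which it originated, regardless of where it terminates. The following minimal Python sketch is non-authoritative; Region, TouchStroke and route_stroke are hypothetical names not drawn from either document.

    from dataclasses import dataclass
    from enum import Enum, auto

    class Region(Enum):
        ACTIVE = auto()
        VIRTUAL_BEZEL = auto()

    @dataclass
    class TouchStroke:
        origin_region: Region      # region where the contact first lands
        terminal_region: Region    # region where the contact is released

    def route_stroke(stroke: TouchStroke) -> Region:
        # Claim 3: originates in the active region, terminates in the virtual
        # bezel -> processed as an active-touchscreen input.
        # Claim 4: originates in the virtual bezel, terminates in the active
        # region -> processed as a virtual-bezel input.
        return stroke.origin_region

Under this reading the routing depends only on the origin of the stroke, which is precisely the distinction the two dependent claims draw.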

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
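
Claims 5 through 7 recite three alternative policies for a multi-touch input that begins simultaneously in both regions: attribute it to the virtual bezel, attribute it to the active region, or follow an instruction previously registered by the user. A minimal, non-authoritative sketch follows; SpanPolicy and route_spanning_multitouch are hypothetical names.

    from enum import Enum, auto
    from typing import Optional

    class Region(Enum):
        ACTIVE = auto()
        VIRTUAL_BEZEL = auto()

    class SpanPolicy(Enum):
        BEZEL_WINS = auto()     # claim 5: handled as a virtual-bezel multi-touch
        ACTIVE_WINS = auto()    # claim 6: handled as an active-region multi-touch
        USER_DEFINED = auto()   # claim 7: handled per a user-registered instruction

    def route_spanning_multitouch(policy: SpanPolicy,
                                  user_choice: Optional[Region] = None) -> Region:
        if policy is SpanPolicy.BEZEL_WINS:
            return Region.VIRTUAL_BEZEL
        if policy is SpanPolicy.ACTIVE_WINS:
            return Region.ACTIVE
        if user_choice is None:
            raise ValueError("no user instruction registered for spanning inputs")
        return user_choice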

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
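
Claim 8 places an operating system status bar in the virtual bezel and lets the user toggle its visibility, and hence a full-screen mode, with a predefined set of gestures. A minimal sketch, assuming one hypothetical gesture name, follows.

    class VirtualBezelStatusBar:
        # The claim requires only "a predefined set of gestures"; the specific
        # gesture below is assumed purely for illustration.
        TOGGLE_GESTURES = {"two_finger_swipe_down"}

        def __init__(self) -> None:
            self.visible = True   # status bar shown; full-screen mode off

        def on_bezel_gesture(self, gesture: str) -> None:
            if gesture in self.TOGGLE_GESTURES:
                self.visible = not self.visible   # enter or leave full-screen mode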

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
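
Claims 9 through 11 recite user-controlled soft buttons hosted in the virtual bezel: repositioning a button, toggling it between visible and hidden modes, and adding new buttons. The sketch below is illustrative only; SoftButton and VirtualBezelButtons are hypothetical names.

    from dataclasses import dataclass

    @dataclass
    class SoftButton:
        name: str
        position: tuple          # (x, y) within the virtual bezel region
        visible: bool = True

    class VirtualBezelButtons:
        def __init__(self, predefined: list) -> None:
            self.buttons = {b.name: b for b in predefined}

        def reposition(self, name: str, new_pos: tuple) -> None:
            self.buttons[name].position = new_pos          # claim 9

        def toggle_visibility(self, name: str) -> None:
            btn = self.buttons[name]
            btn.visible = not btn.visible                  # claim 10

        def add(self, button: SoftButton) -> None:
            self.buttons[button.name] = button             # claim 11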

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2012044743A2
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set (third set) of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
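
The three instruction sets of WO2012044743A2 claim 16 form a simple pipeline: detect from the hinge sensor that the screens are open, determine the relative position of the screens, then automatically control what each display shows. The sketch below is a non-authoritative paraphrase; the 90-degree open threshold and the string encodings are assumptions.

    from enum import Enum, auto

    class HingeState(Enum):
        CLOSED = auto()
        OPEN = auto()

    def first_instructions(hinge_angle: float, open_threshold: float = 90.0) -> HingeState:
        # First set: determine from the first (hinge) sensor signal that the
        # screens have moved to an open position (threshold assumed).
        return HingeState.OPEN if hinge_angle >= open_threshold else HingeState.CLOSED

    def second_instructions(position_signal: str) -> str:
        # Second set: determine the relative position of the two screens,
        # e.g. which screen held the first application while closed.
        return position_signal

    def third_instructions(state: HingeState, relative_position: str) -> dict:
        # Third set: automatically control the data shown on each display
        # once the hinge has opened.
        if state is HingeState.OPEN:
            return {"first_display": "first application",
                    "second_display": "secondary or extended view"}
        return {"first_display": "first application", "second_display": "off"}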

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044743A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044743A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044743A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044743A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .
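
Claims 16 through 18 of US9645663B2 derive the virtual bezel from the user's grip, detected either as unintentional touch input or as a heat signature from thermal sensors, registered as a polygonal area together with its access frequency. A minimal sketch with a hypothetical frequency threshold follows.

    from collections import Counter

    class HoldingPatternRegistry:
        def __init__(self) -> None:
            self.areas = Counter()   # polygon (tuple of vertices) -> access count

        def register_area(self, polygon: tuple) -> None:
            # Register the polygonal area in memory and count how often it is
            # contacted by the holding hand.
            self.areas[polygon] += 1

        def virtual_bezel(self, min_accesses: int = 5) -> list:
            # Polygons contacted often enough define the personalized holding
            # pattern, and hence the virtual bezel region (threshold assumed).
            return [poly for poly, count in self.areas.items()
                    if count >= min_accesses]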




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044775A1

Filed: 2011-09-29     Issued: 2012-04-05

Keyboard filling one screen or spanning multiple screens of a multiple screen device

(Original Assignee) Imerj LLC     

Sanjiv Sirpal, Robert Csiki
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (operating modes) of response to a first set (first one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (operating modes) of response to a second set (first one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044775A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

WO2012044775A1
CLAIM 11
. A device , comprising : a first screen , the first screen including a touch sensitive display area ;
a second screen , the second screen including a touch sensitive display area ;
memory ;
a processor ;
application programming stored in the memory and executed by the processor , wherein the application programming is operable to : in a first operating mode , display a virtual keyboard using a portion of the touch sensitive display area of the first screen and using a portion of the touch sensitive display area of the second screen ;
in a second operating mode , display the virtual keyboard using at least a portion of a first one (first set, second set) of the touch sensitive display area of the first screen and the second screen , and without using any portion of a second one of the touch sensitive display area of the first screen and the second screen .
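
WO2012044775A1 distinguishes two operating modes for the virtual keyboard: spanning both touch screens, or occupying only one of them. The sketch below paraphrases that selection; the screen labels and the particular split of the keyboard are assumptions.

    from enum import Enum, auto

    class KeyboardMode(Enum):
        SPANNING = auto()   # first operating mode: keyboard split across both screens
        SINGLE = auto()     # second operating mode: keyboard on one screen only

    def layout_keyboard(mode: KeyboardMode, target: str = "first") -> dict:
        if mode is KeyboardMode.SPANNING:
            return {"first_screen": "first portion of keyboard",
                    "second_screen": "second portion of keyboard"}
        other = "second" if target == "first" else "first"
        return {target + "_screen": "entire keyboard",
                other + "_screen": "no keyboard"}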

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (operating modes) of response in the active touchscreen region .
WO2012044775A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (operating modes) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (operating modes) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044775A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (operating modes) of response in the virtual bezel region .
WO2012044775A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044775A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044775A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044805A1

Filed: 2011-09-29     Issued: 2012-04-05

Method and system for performing copy-paste operations on a device via user gestures

(Original Assignee) Imerj LLC     

Sanjiv Sirpal, Paul E. Reeves, Alexander De Paz, Jared L. Ficklin, Denise Burton, Gregg Wygonik
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to a second set (following steps) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044805A1
CLAIM 10
. A computer readable medium for performing a copy-paste operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input from a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area from which displayed data is to be copied ;
receiving an input from a finger drag gesture for identifying a target area of the second display screen into which data from the source area is to be copied , wherein the target area corresponds to a location of the second display screen where the drag gesture is last detected before it ceases to be detected ;
wherein the finger drag gesture includes a continuous contact with the first display screen from the first finger gesture ;
and copying the data to the target area .

WO2012044805A1
CLAIM 11
. The computer readable medium of Claim 10 , wherein the source area and the target area each corresponds to a displayed portion of a different application (first set) window ;
wherein each of the different application windows correspond to a different software application installed on the device .
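
WO2012044805A1 claim 10 describes a cross-screen copy-paste driven by a selection gesture on the first screen and a continuous drag whose last detected location on the second screen identifies the target area. A minimal, non-authoritative sketch follows; Gesture and copy_paste are hypothetical names.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Gesture:
        screen: str                           # "first" or "second"
        position: tuple                       # location of the contact
        continuous_from_previous: bool = False

    def copy_paste(unfolded: bool, select: Gesture, drag_end: Gesture,
                   source_data: str) -> Optional[str]:
        if not unfolded:
            return None   # screens must face substantially the same direction
        if select.screen != "first" or drag_end.screen != "second":
            return None   # select on the first screen, drop on the second
        if not drag_end.continuous_from_previous:
            return None   # the drag must be continuous contact from the selection
        return source_data   # data copied into the target area where the drag ends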

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044805A1
CLAIM 10
. A computer readable medium for performing a copy-paste operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input from a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area from which displayed data is to be copied ;
receiving an input from a finger drag gesture for identifying a target area of the second display screen into which data from the source area is to be copied , wherein the target area corresponds to a location of the second display screen where the drag gesture is last detected before it ceases to be detected ;
wherein the finger drag gesture includes a continuous contact with the first display screen from the first finger gesture ;
and copying the data to the target area .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (following steps) of response in the virtual bezel region .
WO2012044805A1
CLAIM 10
. A computer readable medium for performing a copy-paste operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input from a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area from which displayed data is to be copied ;
receiving an input from a finger drag gesture for identifying a target area of the second display screen into which data from the source area is to be copied , wherein the target area corresponds to a location of the second display screen where the drag gesture is last detected before it ceases to be detected ;
wherein the finger drag gesture includes a continuous contact with the first display screen from the first finger gesture ;
and copying the data to the target area .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044801A1

Filed: 2011-09-29     Issued: 2012-04-05

Application display transitions between single and multiple displays

(Original Assignee) Imerj LLC     

Martin Gimpl, Paul Edward Reeves, Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set (first set) of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .
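
WO2012044801A1 claim 10 again recites three instruction sets: detect a change of physical orientation, check whether both screens display content for the same application, and change the display configuration accordingly, either expanding the application to both screens or collapsing it to one. The sketch below is illustrative only; the string encodings and the exact expansion policy are assumptions.

    def orientation_changed(previous: str, current: str) -> bool:
        # First set: the device moved from a folded or landscape orientation
        # to a different second physical orientation.
        return previous in {"folded", "landscape"} and current != previous

    def same_application(first_screen_app: str, second_screen_app: str) -> bool:
        # Second set: are both screens displaying content for one application?
        return first_screen_app == second_screen_app

    def reconfigure(changed: bool, same_app: bool, currently_dual: bool) -> str:
        # Third set: (a) expand from only the first screen to both screens,
        # or (b) collapse from both screens back to only the first screen.
        if not changed:
            return "dual" if currently_dual else "single"
        if same_app and not currently_dual:
            return "dual"
        return "single"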

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

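For claims 9 through 11 (claim 9 charted above, claims 10 and 11 charted below), a hypothetical Kotlin sketch of a soft-button bar resident in the virtual bezel region, with the repositioning, visibility toggling, and button-adding operations those claims recite. The data model (SoftButton, BezelButtonBar) is assumed for illustration only.

// Hypothetical illustration of claims 9-11: a set of soft buttons lives in the
// virtual bezel and the user can reposition them, toggle their visibility, or
// add new ones.
data class SoftButton(val id: String, var x: Float, var y: Float, var visible: Boolean = true)

class BezelButtonBar(private val buttons: MutableList<SoftButton> = mutableListOf()) {

    fun add(id: String, x: Float, y: Float) = buttons.add(SoftButton(id, x, y))    // claim 11

    fun reposition(id: String, newX: Float, newY: Float) {                          // claim 9
        buttons.find { it.id == id }?.apply { x = newX; y = newY }
    }

    fun toggleVisibility(id: String) {                                              // claim 10
        buttons.find { it.id == id }?.apply { visible = !visible }
    }

    fun snapshot(): List<SoftButton> = buttons.map { it.copy() }
}

fun main() {
    val bar = BezelButtonBar()
    bar.add("back", 10f, 500f)
    bar.reposition("back", 10f, 300f)
    bar.toggleVisibility("back")
    println(bar.snapshot())
}
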
US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2012044801A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set (third set) of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044801A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
determining whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
changing a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the step of changing includes a step of modifying a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time (holding pattern) , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

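A hypothetical Kotlin sketch of the method of claim 17: unintentional touches received while the user holds the device are accumulated into a polygonal area (simplified here to a rectangular bounding polygon), the frequency of accessing that area is tracked, and once a threshold is reached the area is registered as the personalized holding pattern defining the virtual bezel region. The class HoldingPatternLearner, the threshold, and the bounding-box simplification are assumptions, not anything disclosed in either document.

// Hypothetical illustration of claim 17: grip touches are accumulated into a
// polygonal area, its access frequency is tracked, and once used often enough
// the area is registered as the personalized virtual bezel region.
data class Point(val x: Float, val y: Float)
data class Polygon(val vertices: List<Point>)

class HoldingPatternLearner(private val frequencyThreshold: Int = 5) {
    private var minX = Float.MAX_VALUE; private var minY = Float.MAX_VALUE
    private var maxX = -Float.MAX_VALUE; private var maxY = -Float.MAX_VALUE
    private var accessCount = 0
    var registeredBezel: Polygon? = null
        private set

    fun onUnintentionalTouch(p: Point) {
        val insideExisting = p.x in minX..maxX && p.y in minY..maxY
        minX = minOf(minX, p.x); minY = minOf(minY, p.y)
        maxX = maxOf(maxX, p.x); maxY = maxOf(maxY, p.y)
        if (insideExisting) accessCount++                        // frequency of accessing the polygonal area
        if (accessCount >= frequencyThreshold)                   // register the personalized holding pattern
            registeredBezel = Polygon(listOf(
                Point(minX, minY), Point(maxX, minY), Point(maxX, maxY), Point(minX, maxY)))
    }
}

fun main() {
    val learner = HoldingPatternLearner(frequencyThreshold = 3)
    repeat(5) { learner.onUnintentionalTouch(Point(5f, 200f)) }  // repeated grip contact
    println(learner.registeredBezel)
}
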
US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044801A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
determining whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
changing a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the step of changing includes a step of modifying a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time (holding pattern) , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

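Likewise for claim 18, a hypothetical Kotlin sketch of deriving the grip polygon from a thermal-sensor heat map: readings above a temperature threshold are taken as the area warmed by the user's hand, and the corners of that area supply the vertices of the polygon to be registered as the virtual bezel. The sensor grid, units, and threshold are assumed.

// Hypothetical illustration of claim 18: thermal-sensor readings are thresholded
// to find the warm area under the hand; the corners of that area serve as the
// vertices of the polygonal area.
data class Cell(val row: Int, val col: Int)

fun gripPolygonFromHeatMap(heat: Array<FloatArray>, thresholdC: Float = 30f): List<Cell> {
    val warm = mutableListOf<Cell>()
    for (r in heat.indices) for (c in heat[r].indices)
        if (heat[r][c] >= thresholdC) warm.add(Cell(r, c))       // cells warmed by the hand
    if (warm.isEmpty()) return emptyList()
    val top = warm.minOf { it.row }; val bottom = warm.maxOf { it.row }
    val left = warm.minOf { it.col }; val right = warm.maxOf { it.col }
    // Vertices of the (rectangular) polygonal area covering the heat signature.
    return listOf(Cell(top, left), Cell(top, right), Cell(bottom, right), Cell(bottom, left))
}

fun main() {
    val heat = Array(4) { FloatArray(4) { c -> if (c == 0) 33f else 24f } }  // warm left edge
    println(gripPolygonFromHeatMap(heat))
}
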



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044780A1

Filed: 2011-09-29     Issued: 2012-04-05

Single- screen view in response to rotation

(Original Assignee) Imerj LLC     

Rodney Wayne Schrock, Martin Gimpl, Sanjiv Sirpal, John Steven Visosky
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode (first direction) of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044780A1
CLAIM 9
. The method of claim 1 , wherein the first rotation input corresponds to a rotation of the device by about 90 degrees in a first direction (first mode) and wherein the second rotation input corresponds to a rotation of the device by about 90 degrees in a second direction that is opposite the first direction .

WO2012044780A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

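To make the two claimed response modes of claim 1 concrete, a minimal Kotlin sketch follows (hypothetical, not from the '663 patent or WO2012044780A1): touches landing in the active touchscreen region are processed directly (first mode), while touches in the virtual bezel region are forwarded to affect active-region content only when a simple intent heuristic suggests they are deliberate rather than grip contact (second mode). The heuristic thresholds and all type names are assumptions.

// Hypothetical illustration of the two response modes in claim 1: direct
// processing in the active region, selective interpretation in the bezel.
data class Touch(val x: Float, val y: Float, val travel: Float, val durationMs: Long)

class VirtualBezelScreen(
    private val width: Float, private val height: Float, private val bezelWidth: Float
) {
    private fun inBezel(t: Touch) =
        t.x < bezelWidth || t.x > width - bezelWidth ||
        t.y < bezelWidth || t.y > height - bezelWidth

    // Second-mode heuristic: short, stationary contacts are treated as resting fingers.
    private fun looksIntentional(t: Touch) = t.travel > 24f || t.durationMs < 150

    fun dispatch(t: Touch): String = when {
        !inBezel(t)          -> "active region: first-mode processing"
        looksIntentional(t)  -> "bezel: forwarded to affect active-region content"
        else                 -> "bezel: ignored as unintentional grip contact"
    }
}

fun main() {
    val screen = VirtualBezelScreen(width = 400f, height = 600f, bezelWidth = 24f)
    println(screen.dispatch(Touch(200f, 300f, travel = 0f, durationMs = 2000)))  // active-region hold
    println(screen.dispatch(Touch(5f, 300f, travel = 60f, durationMs = 200)))    // bezel swipe
    println(screen.dispatch(Touch(5f, 300f, travel = 0f, durationMs = 4000)))    // resting thumb
}
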
US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
WO2012044780A1
CLAIM 9
. The method of claim 1 , wherein the first rotation input corresponds to a rotation of the device by about 90 degrees in a first direction (first mode) and wherein the second rotation input corresponds to a rotation of the device by about 90 degrees in a second direction that is opposite the first direction .

WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

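For claims 3 and 4 charted above, a hypothetical Kotlin sketch of origin-based routing: a drag gesture that crosses the boundary between the two regions is attributed to the region in which it began. Geometry and names are illustrative only.

// Hypothetical illustration of claims 3 and 4: the whole gesture is handled by
// the zone where the first contact landed.
enum class Zone { ACTIVE, BEZEL }

data class DragEvent(val startX: Float, val startY: Float, val endX: Float, val endY: Float)

class OriginRouter(private val width: Float, private val height: Float, private val bezel: Float) {
    private fun zoneAt(x: Float, y: Float): Zone =
        if (x < bezel || x > width - bezel || y < bezel || y > height - bezel) Zone.BEZEL
        else Zone.ACTIVE

    fun routingZone(drag: DragEvent): Zone = zoneAt(drag.startX, drag.startY)
}

fun main() {
    val router = OriginRouter(width = 400f, height = 600f, bezel = 20f)
    println(router.routingZone(DragEvent(200f, 300f, 5f, 300f)))  // ACTIVE (claim 3 case)
    println(router.routingZone(DragEvent(5f, 300f, 200f, 300f)))  // BEZEL  (claim 4 case)
}
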
US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set (third set) of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044780A1
CLAIM 9
. The method of claim 1 , wherein the first rotation input corresponds to a rotation of the device by about 90 degrees in a first direction (first mode) and wherein the second rotation input corresponds to a rotation of the device by about 90 degrees in a second direction that is opposite the first direction .

WO2012044780A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction (fourth instructions) in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044780A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

WO2012044780A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions (response instruction) configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

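A hypothetical Kotlin sketch of the personalization step of claim 16: the detected grip region is registered as the virtual bezel, the user is asked which type of response the system should execute for touches there, and the answer is stored as personalized behavior for that region. The response options and the callback-based prompt are assumptions.

// Hypothetical illustration of claim 16: register the grip region, ask the user
// which response to execute, and persist the choice as personalized behavior.
enum class BezelResponse { IGNORE, SCROLL_ACTIVE_CONTENT, SHOW_SHORTCUTS }

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

class PersonalizedBezel(private val askUser: () -> BezelResponse) {
    private var bezelRegion: Rect? = null
    private var response: BezelResponse = BezelResponse.IGNORE

    fun registerGripRegion(region: Rect) {
        bezelRegion = region          // register the detected region as the virtual bezel region
        response = askUser()          // offer the user to instruct the system what response to execute
    }

    fun onBezelTouch(): BezelResponse = response   // personalized behavior applied thereafter
}

fun main() {
    val bezel = PersonalizedBezel(askUser = { BezelResponse.SCROLL_ACTIVE_CONTENT })
    bezel.registerGripRegion(Rect(0f, 0f, 24f, 600f))
    println(bezel.onBezelTouch())
}
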
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044780A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044780A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044839A2

Filed: 2011-09-29     Issued: 2012-04-05

Smartpad orientation

(Original Assignee) Imerj LLC     

Sanjiv Sirpal, Martin Gimpl, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044839A2
CLAIM 20
. One or more of one or more means for performing the steps of claim 12 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 12 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044839A2
CLAIM 3
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2012044839A2
CLAIM 3
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044839A2
CLAIM 20
. One or more of one or more means for performing the steps of claim 12 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 12 .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044839A2
CLAIM 8
. The smartpad of claim 1 , wherein the smartpad has a single application mode , the single application mode displaying an application in full screen (touchscreen display) .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044839A2
CLAIM 8
. The smartpad of claim 1 , wherein the smartpad has a single application mode , the single application mode displaying an application in full screen (touchscreen display) .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044839A2
CLAIM 8
. The smartpad of claim 1 , wherein the smartpad has a single application mode , the single application mode displaying an application in full screen (touchscreen display) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044739A2

Filed: 2011-09-29     Issued: 2012-04-05

Rotation gravity drop

(Original Assignee) Imerj LLC     

Alexander De Paz
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
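
As a rough reading aid for claim 1 above, the two modes of response can be modeled as a dispatch on where a contact lands. The edge-strip geometry, gesture names, and width below are assumptions for illustration, not features taken from the patent or the charted WO publication.

def classify_touch(x, y, screen_w, screen_h, bezel_px=48):
    # Assumed geometry: a bezel_px-wide virtual bezel strip along every edge,
    # with the remaining interior acting as the active touchscreen region.
    in_bezel = (x < bezel_px or y < bezel_px or
                x > screen_w - bezel_px or y > screen_h - bezel_px)
    return "virtual-bezel" if in_bezel else "active-region"

def dispatch(x, y, screen_w, screen_h, gesture):
    region = classify_touch(x, y, screen_w, screen_h)
    if region == "active-region":
        return ("first-mode", gesture)          # ordinary content interaction
    # Second mode: only selected bezel gestures are treated as intentional
    # input affecting the content shown in the active region.
    intentional = {"swipe-in", "two-finger-tap"}
    return ("second-mode", gesture) if gesture in intentional else ("ignored", gesture)
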
WO2012044739A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
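
Claims 3 and 4 above attribute a boundary-crossing stroke to the region where it originated. A minimal sketch of that origin rule, with classify() assumed to be a region classifier like the one sketched after claim 1:

def process_stroke(points, classify):
    # points: ordered (x, y) samples of one stroke; classify(x, y) returns
    # "active-region" or "virtual-bezel".
    start_region = classify(*points[0])
    end_region = classify(*points[-1])
    # Claim 3: starts active, ends in bezel  -> processed as active-region input.
    # Claim 4: starts in bezel, ends active  -> processed as virtual-bezel input.
    return {"handled_as": start_region, "start": start_region, "end": end_region}
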
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
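
Claims 5 through 7 above cover a multi-touch input landing in both regions at once, routed either to the bezel (claim 5), to the active region (claim 6), or per a user-chosen instruction (claim 7). A compact sketch, with the policy value standing in for that user instruction:

def route_multitouch(contact_regions, policy="bezel"):
    # contact_regions: region label for each simultaneous contact point,
    # e.g. ["active-region", "virtual-bezel"]. policy is an assumed stand-in
    # for the user instruction of claim 7.
    if len(set(contact_regions)) > 1:           # the input spans both regions
        return "virtual-bezel" if policy == "bezel" else "active-region"
    return contact_regions[0]
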
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
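
Claim 8 above places the operating-system status bar in the virtual bezel and toggles its visibility (full-screen mode) with a predefined gesture set. A toy model, with the gesture names invented for illustration:

class StatusBar:
    TOGGLE_GESTURES = {"double-tap-bezel", "swipe-down-from-edge"}  # assumed set

    def __init__(self):
        self.visible = True

    def on_bezel_gesture(self, gesture):
        if gesture in self.TOGGLE_GESTURES:
            self.visible = not self.visible     # toggles full-screen mode
        return self.visible
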
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
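
Claims 9 through 11 above describe touch-based soft buttons confined to the virtual bezel that the user can reposition, show or hide, and extend. A minimal registry sketch (all names hypothetical):

class BezelSoftButtons:
    def __init__(self, predefined):
        # predefined: {name: (x, y)} positions of the pre-defined soft buttons.
        self.buttons = {n: {"pos": p, "visible": True} for n, p in predefined.items()}

    def reposition(self, name, new_pos):        # claim 9: move a button within the bezel
        self.buttons[name]["pos"] = new_pos

    def toggle_visibility(self, name):          # claim 10: visible <-> hidden
        self.buttons[name]["visible"] = not self.buttons[name]["visible"]

    def add(self, name, pos):                   # claim 11: user adds a new soft button
        self.buttons[name] = {"pos": pos, "visible": True}
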
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2012044739A2
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set (third set) of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044739A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044739A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044739A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044739A2
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084676A1

Filed: 2011-09-28     Issued: 2012-04-05

Dual screen application visual indicator

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Alexander de Paz
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084676A1
CLAIM 2
. The method of claim 1 , wherein at least one of the first , second , and third inputs is achieved by at least a user input (user input) gesture .

US20120084676A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ;
instructions configured to receive a third predetermined input that represents an instruction to maximize or minimize a selected one of the first or second desktops or applications ;
and instructions configured to respond to the third predetermined input that causes displaying of the selected one of the first or second desktops or applications in a maximized or minimized condition .

US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120084676A1
CLAIM 12
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application within the window stack and to be displayed in either the first or second displays ;
a third set (third set) of instructions responsive to a gesture made by the user to maximize or minimize a selected one of the first or second desktops or applications ;
and wherein the selected one of the first or second desktops or applications is displayed in a maximized or minimized condition .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084676A1
CLAIM 2
. The method of claim 1 , wherein at least one of the first , second , and third inputs is achieved by at least a user input (user input) gesture .

US20120084676A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ;
instructions configured to receive a third predetermined input that represents an instruction to maximize or minimize a selected one of the first or second desktops or applications ;
and instructions configured to respond to the third predetermined input that causes displaying of the selected one of the first or second desktops or applications in a maximized or minimized condition .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084676A1
CLAIM 2
. The method of claim 1 , wherein at least one of the first , second , and third inputs is achieved by at least a user input (user input) gesture .

US20120084676A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ;
instructions configured to receive a third predetermined input that represents an instruction to maximize or minimize a selected one of the first or second desktops or applications ;
and instructions configured to respond to the third predetermined input that causes displaying of the selected one of the first or second desktops or applications in a maximized or minimized condition .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
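A minimal sketch, assuming a simple frequency threshold, of the steps recited in claim 17 above (registering a polygonal contact area, counting how often it is accessed, and deriving a personalized holding pattern); the HoldingPatternTracker class and threshold value are hypothetical.

# Hypothetical sketch of claim 17: unintentional touches define a polygonal area,
# its access frequency is counted, and a personalized holding pattern is stored.

from collections import Counter

class HoldingPatternTracker:
    def __init__(self, frequency_threshold=3):
        self.polygon_counts = Counter()        # polygon vertices -> times accessed
        self.frequency_threshold = frequency_threshold
        self.virtual_bezel = None              # personalized holding pattern

    def register_unintentional_touch(self, polygon_vertices):
        """Register the polygonal contact area and update its usage frequency."""
        key = tuple(sorted(polygon_vertices))
        self.polygon_counts[key] += 1
        # A frequently contacted polygon becomes the personalized holding pattern
        # that defines the virtual bezel region.
        if self.polygon_counts[key] >= self.frequency_threshold:
            self.virtual_bezel = key
        return self.polygon_counts[key]


if __name__ == "__main__":
    tracker = HoldingPatternTracker()
    grip = [(0, 0), (0, 120), (20, 120), (20, 0)]   # strip along the left edge
    for _ in range(3):
        tracker.register_unintentional_touch(grip)
    print(tracker.virtual_bezel)   # polygon now registered as the virtual bezel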
US20120084676A1
CLAIM 2
. The method of claim 1 , wherein at least one of the first , second , and third inputs is achieved by at least a user input (user input) gesture .

US20120084676A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ;
instructions configured to receive a third predetermined input that represents an instruction to maximize or minimize a selected one of the first or second desktops or applications ;
and instructions configured to respond to the third predetermined input that causes displaying of the selected one of the first or second desktops or applications in a maximized or minimized condition .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
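For claim 18 above, which derives the polygonal area from a heat signature, the sketch below assumes the thermal sensors expose per-cell temperature readings; the function name and threshold are hypothetical.

# Hypothetical sketch of claim 18: a heat signature from the hand holding the
# device (read from assumed thermal sensors) yields a polygonal area that can be
# registered and, with its usage frequency, define the virtual bezel region.

def heat_signature_to_polygon(thermal_readings, threshold=30.0):
    """Return the bounding polygon of sensor cells warmer than the threshold.

    thermal_readings: dict mapping (x, y) sensor cells to temperature in Celsius.
    """
    warm = [cell for cell, temp in thermal_readings.items() if temp >= threshold]
    if not warm:
        return []
    xs = [x for x, _ in warm]
    ys = [y for _, y in warm]
    # Vertices of the rectangular polygon enclosing the warm cells.
    return [(min(xs), min(ys)), (min(xs), max(ys)), (max(xs), max(ys)), (max(xs), min(ys))]


if __name__ == "__main__":
    readings = {(0, 0): 33.5, (0, 1): 34.0, (1, 0): 33.0, (1, 1): 34.2, (5, 5): 22.5}
    print(heat_signature_to_polygon(readings))  # polygon to register as the bezel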
US20120084676A1
CLAIM 2
. The method of claim 1 , wherein at least one of the first , second , and third inputs is achieved by at least a user input (user input) gesture .

US20120084676A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ;
instructions configured to receive a third predetermined input that represents an instruction to maximize or minimize a selected one of the first or second desktops or applications ;
and instructions configured to respond to the third predetermined input that causes displaying of the selected one of the first or second desktops or applications in a maximized or minimized condition .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084698A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen with keyboard

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Alexander de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
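As a non-authoritative illustration of the two response modes recited in claim 1 of US9645663B2, the minimal Python sketch below assumes a fixed-width virtual bezel along the screen edges and approximates "intentional" bezel input with a swipe flag; all names and constants are hypothetical.

# Hypothetical sketch of claim 1's two response modes: touches in the active
# region are handled directly, while the gestural software selectively interprets
# virtual-bezel touches as intentional input affecting the active-region content.

SCREEN_WIDTH = 320
BEZEL_WIDTH = 24   # assumed virtual bezel along the left and right edges

def in_virtual_bezel(x):
    return x < BEZEL_WIDTH or x > SCREEN_WIDTH - BEZEL_WIDTH

def handle_touch(x, y, is_swipe=False):
    """First mode: active-region touches act on displayed content.
    Second mode: bezel touches are ignored unless recognized as intentional
    (approximated here by a swipe gesture)."""
    if not in_virtual_bezel(x):
        return "active region: first mode of response"
    if is_swipe:
        return "virtual bezel: intentional input, affects active-region content"
    return "virtual bezel: resting grip, ignored"


if __name__ == "__main__":
    print(handle_touch(160, 200))                 # active region
    print(handle_touch(5, 200))                   # bezel, resting thumb
    print(handle_touch(5, 200, is_swipe=True))    # bezel, deliberate gesture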
US20120084698A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different application (first set) s .

US20120084698A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084698A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
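A minimal sketch of the behavior recited in claims 12 and 13 above (a status display panel that toggles between visible and hidden modes, with navigation gestures still processed in hidden mode); the class, gesture names, and actions are hypothetical.

# Hypothetical sketch of claims 12-13: the status display panel toggles between
# visible and hidden modes, and a third set of touch inputs still lets the user
# navigate when the panel and the soft buttons are hidden.

class StatusPanel:
    def __init__(self):
        self.visible = True
        self.items = ["battery 80%", "wifi", "12:30"]

    def toggle(self):
        """Claim 12: toggle the status display panel between visible and hidden."""
        self.visible = not self.visible
        return self.visible

def navigate(gesture, panel_visible):
    """Claim 13: navigation gestures are processed even when the panel and the
    soft buttons are in hidden mode."""
    actions = {"swipe_up": "go home", "swipe_right": "go back"}
    mode = "hidden-mode" if not panel_visible else "normal"
    return f"{mode} navigation: {actions.get(gesture, 'no-op')}"


if __name__ == "__main__":
    panel = StatusPanel()
    panel.toggle()                                # hide the status panel
    print(navigate("swipe_up", panel.visible))    # hidden-mode navigation: go home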
US20120084698A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084698A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084701A1

Filed: 2011-09-28     Issued: 2012-04-05

Keyboard maximization

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (closed state) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084701A1
CLAIM 12
. The device of claim 11 , further comprising : a hinge , wherein the hinge interconnects the first and second screens , wherein in the first operating mode the device is in an open state , and wherein in the second operating mode the device is in a closed state (first portion) .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (closed state) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084701A1
CLAIM 12
. The device of claim 11 , further comprising : a hinge , wherein the hinge interconnects the first and second screens , wherein in the first operating mode the device is in an open state , and wherein in the second operating mode the device is in a closed state (first portion) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081313A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen desktop

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, John Steven Visosky, Alexander de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081313A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different application (first set) s .

US20120081313A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081313A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081313A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081313A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084675A1

Filed: 2011-09-28     Issued: 2012-04-05

Annunciator drawer

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Paul E. Reeves, Alexander de Paz, Eduardo Diego Torres Milano, Jared L. Ficklin, Denise Burton, Gregg Wygonik
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084675A1
CLAIM 1
. A method of displaying information on a multi-screen device including a plurality of desktops and/or applications each having at least one window , and an annunciator window , the method comprising : receiving , by a processor , a first input that represents an instruction to reveal one of a desktop or application on a first or second display of the multi-screen device and selecting a desktop or application to display on the first or second display ;
displaying , by a processor , the selected desktop or application on the first or second displays ;
displaying , by a processor , a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
configuring , by a processor , the annunciator window to extend across both said first and second displays ;
expanding , by a processor , a size of the annunciator window in response to a user input (user input) gesture wherein the annunciator is expanded as a drawer over a selected one of said first or second displays ;
and wherein the annunciator window displays information selectively across substantially all or part of said expanded annunciator window .
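For orientation, a minimal sketch (not the reference's implementation) of the annunciator behavior recited in US20120084675A1 claim 1: a status window that spans both displays and expands as a drawer over a selected display in response to a user input gesture; the Annunciator class and its fields are hypothetical.

# Hypothetical sketch of the annunciator of US20120084675A1 claim 1: a status
# window spanning both displays that expands as a drawer over a selected display
# in response to a user input gesture.

class Annunciator:
    def __init__(self):
        self.status = {"device": "ok", "connectivity": "wifi", "messaging": "2 unread"}
        self.expanded_over = None    # display the drawer is currently expanded over

    def render_bar(self):
        """Pre-configured annunciator window extending across both displays."""
        return " | ".join(f"{k}: {v}" for k, v in self.status.items())

    def expand_drawer(self, display_id):
        """User input gesture expands the annunciator as a drawer over one display."""
        self.expanded_over = display_id
        return f"drawer over display {display_id}: {self.render_bar()}"


if __name__ == "__main__":
    annunciator = Annunciator()
    print(annunciator.render_bar())
    print(annunciator.expand_drawer(2))   # gesture executed on the second display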

US20120084675A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first input that represents an instruction to determine and reveal a desktop or application on a first or second display of the multi-screen device ;
instructions configured to respond to the first input with an output that cause the desktop or application to be displayed on the first or second displays ;
instructions configured to receive a second input that represents an instruction to display a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
instructions configured to respond to the second input with an output that causes the annunciator window to be displayed in a configuration extending across both of said first and second displays ;
instructions configured to receive a third input that represents an instruction to expand the annunciator window over a selected one of said first or second displays ;
instructions configured to respond to the third input with an output that causes an annunciator window drawer to be expanded over the selected one of said first or second displays .

US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
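Claims 3 through 7 above recite how gestures spanning the active touchscreen region and the virtual bezel region are attributed; a minimal sketch follows, assuming a fixed-width bezel and expressing the claim 7 user instruction as a policy argument. All names are hypothetical.

# Hypothetical sketch of the attribution rules of claims 3-7: a drag is processed
# in the region where it originated, and a simultaneous multi-touch spanning both
# regions is attributed per a configurable (user-selected) policy.

BEZEL_WIDTH = 24

def region_of(x):
    return "bezel" if x < BEZEL_WIDTH else "active"

def attribute_drag(start_x, end_x):
    """Claims 3-4: the gesture is processed within its region of origin;
    the terminating coordinate does not change the attribution."""
    return region_of(start_x)

def attribute_multi_touch(points, policy="bezel"):
    """Claims 5-7: a multi-touch originating in both regions is processed in the
    bezel (claim 5), the active region (claim 6), or per the user's instruction
    (claim 7), expressed here as the policy argument."""
    regions = {region_of(x) for x, _ in points}
    if regions == {"bezel", "active"}:
        return policy
    return regions.pop()


if __name__ == "__main__":
    print(attribute_drag(start_x=5, end_x=200))                   # bezel (origin)
    print(attribute_drag(start_x=200, end_x=5))                   # active (origin)
    print(attribute_multi_touch([(5, 10), (200, 10)], "active"))  # user policy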
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
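A minimal sketch of the bezel-resident status bar and soft buttons recited in claims 8 through 11 above (toggling visibility, repositioning, hiding, and adding buttons); the BezelChrome class, button names, and positions are hypothetical.

# Hypothetical sketch of claims 8-11: an operating system status bar and a set of
# soft buttons reside in the virtual bezel; the user can toggle visibility,
# reposition buttons within the bezel, hide them, and add new ones.

class BezelChrome:
    def __init__(self):
        self.status_bar_visible = True
        self.soft_buttons = {"back": (0, 10), "home": (0, 40)}  # name -> bezel position
        self.hidden_buttons = set()

    def toggle_status_bar(self):
        """Claim 8: a predefined gesture toggles the status bar / full-screen mode."""
        self.status_bar_visible = not self.status_bar_visible
        return self.status_bar_visible

    def reposition_button(self, name, new_position):
        """Claim 9: reposition a soft button within the virtual bezel."""
        self.soft_buttons[name] = new_position

    def toggle_button(self, name):
        """Claim 10: toggle a soft button between visible and hidden modes."""
        if name in self.hidden_buttons:
            self.hidden_buttons.discard(name)
        else:
            self.hidden_buttons.add(name)

    def add_button(self, name, position):
        """Claim 11: add a new soft button within the virtual bezel."""
        self.soft_buttons[name] = position


if __name__ == "__main__":
    chrome = BezelChrome()
    chrome.toggle_status_bar()                 # enter full-screen mode
    chrome.reposition_button("home", (0, 80))
    chrome.toggle_button("back")               # hide the back button
    chrome.add_button("screenshot", (0, 120))
    print(chrome.soft_buttons, chrome.hidden_buttons, chrome.status_bar_visible)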
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set (second set, display system) of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120084675A1
CLAIM 16
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user ;
an annunciator display extending across a portion said first and second display areas of said first and second displays for displaying at least one of a device status , a connectivity status , and a messaging status ;
a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user and subsequently displayed on selected ones of said first or second displays ;
a second set of instructions configured to determine information to be displayed in said annunciator display based on selected user gestures and respective states of said device status , a connectivity status , and a messaging status ;
and a third set (third set) of instructions configured to expand an area of the annunciator display defining a drawer , as presented on the first or second displays in response to a user gesture executed on a selected display ;


US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084675A1
CLAIM 1
. A method of displaying information on a multi-screen device including a plurality of desktops and/or applications each having at least one window , and an annunciator window , the method comprising : receiving , by a processor , a first input that represents an instruction to reveal one of a desktop or application on a first or second display of the multi-screen device and selecting a desktop or application to display on the first or second display ;
displaying , by a processor , the selected desktop or application on the first or second displays ;
displaying , by a processor , a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
configuring , by a processor , the annunciator window to extend across both said first and second displays ;
expanding , by a processor , a size of the annunciator window in response to a user input (user input) gesture wherein the annunciator is expanded as a drawer over a selected one of said first or second displays ;
and wherein the annunciator window displays information selectively across substantially all or part of said expanded annunciator window .

US20120084675A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first input that represents an instruction to determine and reveal a desktop or application on a first or second display of the multi-screen device ;
instructions configured to respond to the first input with an output that cause the desktop or application to be displayed on the first or second displays ;
instructions configured to receive a second input that represents an instruction to display a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
instructions configured to respond to the second input with an output that causes the annunciator window to be displayed in a configuration extending across both of said first and second displays ;
instructions configured to receive a third input that represents an instruction to expand the annunciator window over a selected one of said first or second displays ;
instructions configured to respond to the third input with an output that causes an annunciator window drawer to be expanded over the selected one of said first or second displays .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084675A1
CLAIM 1
. A method of displaying information on a multi-screen device including a plurality of desktops and/or applications each having at least one window , and an annunciator window , the method comprising : receiving , by a processor , a first input that represents an instruction to reveal one of a desktop or application on a first or second display of the multi-screen device and selecting a desktop or application to display on the first or second display ;
displaying , by a processor , the selected desktop or application on the first or second displays ;
displaying , by a processor , a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
configuring , by a processor , the annunciator window to extend across both said first and second displays ;
expanding , by a processor , a size of the annunciator window in response to a user input (user input) gesture wherein the annunciator is expanded as a drawer over a selected one of said first or second displays ;
and wherein the annunciator window displays information selectively across substantially all or part of said expanded annunciator window .

US20120084675A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first input that represents an instruction to determine and reveal a desktop or application on a first or second display of the multi-screen device ;
instructions configured to respond to the first input with an output that cause the desktop or application to be displayed on the first or second displays ;
instructions configured to receive a second input that represents an instruction to display a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
instructions configured to respond to the second input with an output that causes the annunciator window to be displayed in a configuration extending across both of said first and second displays ;
instructions configured to receive a third input that represents an instruction to expand the annunciator window over a selected one of said first or second displays ;
instructions configured to respond to the third input with an output that causes an annunciator window drawer to be expanded over the selected one of said first or second displays .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084675A1
CLAIM 1
. A method of displaying information on a multi-screen device including a plurality of desktops and/or applications each having at least one window , and an annunciator window , the method comprising : receiving , by a processor , a first input that represents an instruction to reveal one of a desktop or application on a first or second display of the multi-screen device and selecting a desktop or application to display on the first or second display ;
displaying , by a processor , the selected desktop or application on the first or second displays ;
displaying , by a processor , a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
configuring , by a processor , the annunciator window to extend across both said first and second displays ;
expanding , by a processor , a size of the annunciator window in response to a user input (user input) gesture wherein the annunciator is expanded as a drawer over a selected one of said first or second displays ;
and wherein the annunciator window displays information selectively across substantially all or part of said expanded annunciator window .

US20120084675A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first input that represents an instruction to determine and reveal a desktop or application on a first or second display of the multi-screen device ;
instructions configured to respond to the first input with an output that cause the desktop or application to be displayed on the first or second displays ;
instructions configured to receive a second input that represents an instruction to display a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
instructions configured to respond to the second input with an output that causes the annunciator window to be displayed in a configuration extending across both of said first and second displays ;
instructions configured to receive a third input that represents an instruction to expand the annunciator window over a selected one of said first or second displays ;
instructions configured to respond to the third input with an output that causes an annunciator window drawer to be expanded over the selected one of said first or second displays .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084675A1
CLAIM 1
. A method of displaying information on a multi-screen device including a plurality of desktops and/or applications each having at least one window , and an annunciator window , the method comprising : receiving , by a processor , a first input that represents an instruction to reveal one of a desktop or application on a first or second display of the multi-screen device and selecting a desktop or application to display on the first or second display ;
displaying , by a processor , the selected desktop or application on the first or second displays ;
displaying , by a processor , a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
configuring , by a processor , the annunciator window to extend across both said first and second displays ;
expanding , by a processor , a size of the annunciator window in response to a user input (user input) gesture wherein the annunciator is expanded as a drawer over a selected one of said first or second displays ;
and wherein the annunciator window displays information selectively across substantially all or part of said expanded annunciator window .

US20120084675A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first input that represents an instruction to determine and reveal a desktop or application on a first or second display of the multi-screen device ;
instructions configured to respond to the first input with an output that causes the desktop or application to be displayed on the first or second displays ;
instructions configured to receive a second input that represents an instruction to display a pre-configured annunciator window having information therein showing at least one of a device status , a connectivity status , and a messaging status ;
instructions configured to respond to the second input with an output that causes the annunciator window to be displayed in a configuration extending across both of said first and second displays ;
instructions configured to receive a third input that represents an instruction to expand the annunciator window over a selected one of said first or second displays ;
instructions configured to respond to the third input with an output that causes an annunciator window drawer to be expanded over the selected one of said first or second displays .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084674A1

Filed: 2011-09-28     Issued: 2012-04-05

Allowing multiple orientations in dual screen view

(Original Assignee) Imerj LLC     (Current Assignee) Z124

John Steven Visosky
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen (display state) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system, main display, display area, user inputs) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
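
To make the two claimed modes of response concrete, the dispatch recited in '663 claim 1 can be sketched as a simple router: touches in the active region are handled normally, while touches in the virtual bezel are only forwarded when a gestural filter deems them intentional input meant to affect the content in the active region. The Python below is a hypothetical sketch, not either patent's implementation; Rect, route_touch, and the intent test are invented names and thresholds.

from dataclasses import dataclass

@dataclass
class Rect:
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x, y):
        return self.left <= x < self.right and self.top <= y < self.bottom

ACTIVE_REGION = Rect(40, 0, 1040, 1920)      # first mode of response
VIRTUAL_BEZEL = Rect(0, 0, 40, 1920)         # second mode of response, along the left edge

def looks_intentional(touch):
    """Stand-in for the claimed gestural software application: e.g. a swipe
    longer than a threshold is treated as intentional rather than grip contact."""
    return touch.get("path_length", 0) > 30

def route_touch(touch):
    x, y = touch["x"], touch["y"]
    if ACTIVE_REGION.contains(x, y):
        return "first mode: normal touchscreen handling"
    if VIRTUAL_BEZEL.contains(x, y):
        if looks_intentional(touch):
            return "second mode: forward gesture to content in the active region"
        return "second mode: ignore as incidental grip contact"
    return "outside display"

print(route_touch({"x": 500, "y": 600}))                      # active region
print(route_touch({"x": 10, "y": 600, "path_length": 80}))    # intentional bezel swipe
print(route_touch({"x": 10, "y": 600, "path_length": 2}))     # resting thumb
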
US20120084674A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has been placed in a dual display state (display screen) with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
determining that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
determining that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , causing the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .
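
For context, the '674 method amounts to maintaining an independent orientation per display and, when an orientation input arrives from an application, the device, or the user, rotating only the selected desktop or application. A hypothetical Python sketch (DualScreenController and its fields are invented names, not the '674 disclosure):

class DualScreenController:
    """Hypothetical model of '674 claim 1: a dual display state with an
    independent orientation for each displayed desktop/application."""

    def __init__(self):
        self.dual_display = True
        self.orientation = {1: "portrait", 2: "portrait"}
        self.locked = {1: False, 2: False}

    def request_orientation(self, display_id, new_orientation, source):
        """Orientation input may come from an application, the device, or the user."""
        assert source in ("application", "device", "user")
        if self.locked[display_id]:
            return f"display {display_id} locked in {self.orientation[display_id]}"
        self.orientation[display_id] = new_orientation
        return f"display {display_id} now {new_orientation} (requested by {source})"

    def lock(self, display_id):
        """Claim 7: the user may lock one display's orientation."""
        self.locked[display_id] = True

ctrl = DualScreenController()
ctrl.lock(1)                                             # keep display 1 as-is
print(ctrl.request_orientation(2, "landscape", "user"))  # only display 2 rotates
print(ctrl.request_orientation(1, "landscape", "device"))
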

US20120084674A1
CLAIM 6
. The method of claim 4 , wherein the at least one input comprises a plurality of inputs including application , device , and user inputs (touchscreen layer, touchscreen display, receiving touch) .

US20120084674A1
CLAIM 7
. The method of claim 1 , further including : determining that an input is received from the user instructing that the first orientation of one of the desktops or applications should be locked , and in response to determining that the user input is received , causing the data from another selected one of the first or second desktops or applications to remain displayed (touchscreen layer, touchscreen display, receiving touch) in the first orientation while the data from the selected one of the first or second desktops is displayed in the second different orientation .

US20120084674A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display, receiving touch) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area (touchscreen layer, touchscreen display, receiving touch) ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
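
'663 claims 3 and 4 together recite that a drag is attributed to the region in which it originates, regardless of where it terminates. A short hypothetical sketch of that origin-based attribution (region_of and its bezel width are invented, not from the patent):

def region_of(x, y, bezel_width=40, screen_width=1080):
    """Classify a point as 'bezel' (within bezel_width of an edge) or 'active'."""
    return "bezel" if x < bezel_width or x >= screen_width - bezel_width else "active"

def attribute_gesture(start, end):
    """Claims 3-4: the gesture is processed by the region where it originated,
    even if it terminates in the other region."""
    return region_of(*start)

print(attribute_gesture((600, 500), (10, 500)))   # starts in active region -> 'active' (claim 3)
print(attribute_gesture((10, 500), (600, 500)))   # starts in virtual bezel  -> 'bezel'  (claim 4)
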
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (display state) .
US20120084674A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has been placed in a dual display state (display screen) with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
determining that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
determining that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , causing the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (display state) .
US20120084674A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has been placed in a dual display state (display screen) with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
determining that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
determining that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , causing the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
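
'663 claims 5 through 7 differ only in which region handles a multi-touch input that begins simultaneously in both regions: the virtual bezel (claim 5), the active touchscreen region (claim 6), or whichever the user has instructed (claim 7). A hypothetical policy table makes the distinction explicit; the claim language itself does not prescribe this API, and the names below are invented.

MULTI_TOUCH_POLICIES = {
    "claim_5": "virtual_bezel",      # simultaneous multi-touch handled within the bezel
    "claim_6": "active_region",      # simultaneous multi-touch handled within the active region
}

def resolve_multi_touch(user_preference=None):
    """Claim 7: a user-supplied instruction decides how spanning multi-touch
    input is processed; otherwise fall back to a fixed rule."""
    if user_preference in ("virtual_bezel", "active_region"):
        return user_preference
    return MULTI_TOUCH_POLICIES["claim_6"]

print(resolve_multi_touch())                    # fixed default rule
print(resolve_multi_touch("virtual_bezel"))     # user instruction (claim 7)
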
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
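
'663 claim 8 places the operating system status bar inside the virtual bezel and lets a predefined gesture set toggle its visibility for a full-screen mode. A minimal hypothetical sketch (the gesture names are examples, not from the patent):

class StatusBar:
    """Hypothetical model of claim 8: a status bar residing in the virtual
    bezel whose visibility is toggled by a predefined gesture set."""

    TOGGLE_GESTURES = {"two_finger_swipe_down", "edge_double_tap"}  # invented examples

    def __init__(self):
        self.visible = True

    def on_gesture(self, gesture):
        if gesture in self.TOGGLE_GESTURES:
            self.visible = not self.visible     # enter/leave full-screen mode
        return "status bar visible" if self.visible else "full-screen mode"

bar = StatusBar()
print(bar.on_gesture("edge_double_tap"))   # hide -> full-screen
print(bar.on_gesture("edge_double_tap"))   # show again
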
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
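
'663 claims 9 through 11 add a pre-defined set of soft buttons inside the virtual bezel that the user can reposition (claim 9), show or hide (claim 10), and extend with new buttons (claim 11). A hypothetical sketch of that button state; BezelSoftButtons and the button names are invented.

class BezelSoftButtons:
    """Hypothetical model of claims 9-11: repositionable, hideable, and
    user-extensible soft buttons confined to the virtual bezel region."""

    def __init__(self):
        self.buttons = {"back": (10, 100, True), "home": (10, 200, True)}  # name -> (x, y, visible)

    def reposition(self, name, x, y):                 # claim 9
        _, _, visible = self.buttons[name]
        self.buttons[name] = (x, y, visible)

    def toggle_visibility(self, name):                # claim 10
        x, y, visible = self.buttons[name]
        self.buttons[name] = (x, y, not visible)

    def add_button(self, name, x, y):                 # claim 11
        self.buttons[name] = (x, y, True)

buttons = BezelSoftButtons()
buttons.reposition("back", 10, 400)
buttons.toggle_visibility("home")
buttons.add_button("screenshot", 10, 600)
print(buttons.buttons)
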
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen (display state) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084674A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has been placed in a dual display state (display screen) with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
determining that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
determining that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , causing the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set (second set, display system) of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set (third set) of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (display state) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system, main display, display area, user inputs) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084674A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has been placed in a dual display state (display screen) with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
determining that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
determining that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , causing the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 6
. The method of claim 4 , wherein the at least one input comprises a plurality of inputs including application , device , and user inputs (touchscreen layer, touchscreen display, receiving touch) .

US20120084674A1
CLAIM 7
. The method of claim 1 , further including : determining that an input is received from the user instructing that the first orientation of one of the desktops or applications should be locked , and in response to determining that the user input is received , causing the data from another selected one of the first or second desktops or applications to remain displayed (touchscreen layer, touchscreen display, receiving touch) in the first orientation while the data from the selected one of the first or second desktops is displayed in the second different orientation .

US20120084674A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display, receiving touch) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area (touchscreen layer, touchscreen display, receiving touch) ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (display state) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120084674A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has been placed in a dual display state (display screen) with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
determining that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
determining that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , causing the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system, main display, display area, user inputs) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
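
'663 claim 16 differs from claim 18 in that the region comes from ordinary finger contact rather than thermal sensing, and in that the user explicitly chooses what response the region should produce. A hypothetical sketch of that per-region personalization (PersonalizedBezel and the stored response strings are invented):

class PersonalizedBezel:
    """Hypothetical model of claim 16: a detected grip region becomes the
    virtual bezel, and the user's chosen response is stored for that region."""

    def __init__(self):
        self.regions = {}            # region id -> (bounding box, response type)

    def register_grip_region(self, box):
        region_id = len(self.regions)
        self.regions[region_id] = (box, None)
        return region_id

    def set_response(self, region_id, response_type):
        """Store the user's instruction (e.g. 'scroll content', 'ignore') as
        personalized behavior for the detected region."""
        box, _ = self.regions[region_id]
        self.regions[region_id] = (box, response_type)

bezel = PersonalizedBezel()
rid = bezel.register_grip_region((0, 0, 40, 1920))     # left-edge grip
bezel.set_response(rid, "scroll content in active region")
print(bezel.regions)
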
US20120084674A1
CLAIM 6
. The method of claim 4 , wherein the at least one input comprises a plurality of inputs including application , device , and user inputs (touchscreen layer, touchscreen display, receiving touch) .

US20120084674A1
CLAIM 7
. The method of claim 1 , further including : determining that an input is received from the user instructing that the first orientation of one of the desktops or applications should be locked , and in response to determining that the user input is received , causing the data from another selected one of the first or second desktops or applications to remain displayed (touchscreen layer, touchscreen display, receiving touch) in the first orientation while the data from the selected one of the first or second desktops is displayed in the second different orientation .

US20120084674A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display, receiving touch) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area (touchscreen layer, touchscreen display, receiving touch) ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system, main display, display area, user inputs) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
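
'663 claim 17 reaches the same personalized holding pattern as claim 18, but seeds the polygonal area with unintentional touches rather than a thermal signature and gates registration on access frequency. A short hypothetical sketch (the grid-cell bucketing and thresholds are assumptions, not claim limitations):

from collections import Counter

def holding_pattern_from_touches(unintentional_touches, min_count=5, cell=40):
    """Bucket unintentional touch points into grid cells; cells hit at least
    min_count times approximate the polygonal holding-pattern area."""
    counts = Counter((x // cell, y // cell) for x, y in unintentional_touches)
    cells = [c for c, n in counts.items() if n >= min_count]
    if not cells:
        return None
    xs = [cx * cell for cx, _ in cells]
    ys = [cy * cell for _, cy in cells]
    # Return the bounding box of frequently touched cells as a simple polygon.
    return [(min(xs), min(ys)), (max(xs) + cell, min(ys)),
            (max(xs) + cell, max(ys) + cell), (min(xs), max(ys) + cell)]

touches = [(12, 500 + i) for i in range(8)]            # a thumb resting near the left edge
print(holding_pattern_from_touches(touches))
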
US20120084674A1
CLAIM 6
. The method of claim 4 , wherein the at least one input comprises a plurality of inputs including application , device , and user inputs (touchscreen layer, touchscreen display, receiving touch) .

US20120084674A1
CLAIM 7
. The method of claim 1 , further including : determining that an input is received from the user instructing that the first orientation of one of the desktops or applications should be locked , and in response to determining that the user input is received , causing the data from another selected one of the first or second desktops or applications to remain displayed (touchscreen layer, touchscreen display, receiving touch) in the first orientation while the data from the selected one of the first or second desktops is displayed in the second different orientation .

US20120084674A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display, receiving touch) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area (touchscreen layer, touchscreen display, receiving touch) ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system, main display, display area, user inputs) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084674A1
CLAIM 6
. The method of claim 4 , wherein the at least one input comprises a plurality of inputs including application , device , and user inputs (touchscreen layer, touchscreen display, receiving touch) .

US20120084674A1
CLAIM 7
. The method of claim 1 , further including : determining that an input is received from the user instructing that the first orientation of one of the desktops or applications should be locked , and in response to determining that the user input is received , causing the data from another selected one of the first or second desktops or applications to remain displayed (touchscreen layer, touchscreen display, receiving touch) in the first orientation while the data from the selected one of the first or second desktops is displayed in the second different orientation .

US20120084674A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display, receiving touch) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .

US20120084674A1
CLAIM 15
. A multi-screen user device , comprising : a first display including a first display area (touchscreen layer, touchscreen display, receiving touch) ;
a second display including a second display area ;
an independent display orientation element configured to manage a plurality of inputs corresponding to data to determine the orientation of desktops or applications displayed on the first and second displays ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been placed in a dual display state with a first desktop or application displayed on a first display and second desktop or application displayed on a second display , wherein data from the first and second desktops/application are displayed on their respective first and second displays ;
a second set of instructions configured to determine that the first and second desktops or applications are each displayed in respective first portrait or landscape orientations ;
a third set of instructions configured to determine that an input is received from at least one of the desktops or applications , or from the device , or from a user , instructing that the first orientations of one of the desktops or applications should change to a second different orientation ;
and in response to determining that an input is received , a fourth set of instructions configured to cause the data from a selected one of the first or second desktops or applications to be displayed in the second different orientation .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081317A1

Filed: 2011-09-28     Issued: 2012-04-05

Method and system for performing copy-paste operations on a device via user gestures

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Paul Reeves, Alexander de Paz, Jared Ficklin, Denise Burton, Gregg Wygonik
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to a second set (following steps) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081317A1
CLAIM 10
. A computer readable medium for performing a copy-paste operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input from a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area from which displayed data is to be copied ;
receiving an input from a finger drag gesture for identifying a target area of the second display screen into which data from the source area is to be copied , wherein the target area corresponds to a location of the second display screen where the drag gesture is last detected before it ceases to be detected ;
wherein the finger drag gesture includes a continuous contact with the first display screen from the first finger gesture ;
and copying the data to the target area .

US20120081317A1
CLAIM 11
. The computer readable medium of claim 10 , wherein the source area and the target area each corresponds to a displayed portion of a different application (first set) window ;
wherein each of the different application windows correspond to a different software application installed on the device .
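
To make the charted first and second modes of response concrete, a minimal Python sketch of one way a display could route touches either to an active region or to an edge "virtual bezel" band is given below. The class and parameter names (VirtualBezelScreen, bezel_px, route) are hypothetical, and the 40-pixel bezel width is an assumed value, not a limitation of claim 1.

from dataclasses import dataclass

@dataclass(frozen=True)
class Touch:
    x: float
    y: float

class VirtualBezelScreen:
    """Hypothetical touch router: an edge band acts as a 'virtual bezel' with a
    second mode of response, the interior as the active touchscreen region."""

    def __init__(self, width: int, height: int, bezel_px: int = 40):
        self.width, self.height, self.bezel_px = width, height, bezel_px

    def in_virtual_bezel(self, t: Touch) -> bool:
        # The bezel is the band of pixels along any edge of the display.
        return (
            t.x < self.bezel_px or t.x > self.width - self.bezel_px
            or t.y < self.bezel_px or t.y > self.height - self.bezel_px
        )

    def route(self, t: Touch) -> str:
        if self.in_virtual_bezel(t):
            # Second mode: interpret selectively, e.g. only deliberate gestures
            # are treated as intentional input that may affect content shown in
            # the active region; stray grip contact is ignored.
            return "bezel-mode"
        # First mode: ordinary handling of the first set of touch-based inputs.
        return "active-mode"

if __name__ == "__main__":
    screen = VirtualBezelScreen(width=1080, height=1920)
    print(screen.route(Touch(10, 600)))     # near the left edge -> bezel-mode
    print(screen.route(Touch(540, 960)))    # centre of the display -> active-mode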

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081317A1
CLAIM 10
. A computer readable medium for performing a copy-paste operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and second display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input from a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area from which displayed data is to be copied ;
receiving an input from a finger drag gesture for identifying a target area of the second display screen into which data from the source area is to be copied , wherein the target area corresponds to a location of the second display screen where the drag gesture is last detected before it ceases to be detected ;
wherein the finger drag gesture includes a continuous contact with the first display screen from the first finger gesture ;
and copying the data to the target area .
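
As context for the mapped claim 10 of US20120081317A1, here is a minimal Python sketch of the recited copy-paste flow: a source gesture on the first screen, a continuous drag, and copying to wherever the drag is last detected on the second screen. DualScreenClipboard and its methods are hypothetical names, not terms from the reference.

class DualScreenClipboard:
    """Hypothetical sketch: select a source area on the first screen, drag
    continuously, and copy the data to the location on the second screen
    where the drag was last detected."""

    def __init__(self, unfolded: bool):
        self.unfolded = unfolded
        self.source_data = None
        self.targets = {}           # target location -> copied data

    def select_source(self, screen: int, area: str, data: str):
        # First finger gesture on the first display screen identifies the source area.
        if self.unfolded and screen == 1:
            self.source_data = (area, data)

    def drag(self, path):
        # The drag is continuous contact starting from the source gesture; the
        # target is where the drag is last detected before contact ceases.
        if self.source_data is None or not path:
            return None
        last_screen, last_location = path[-1]
        if last_screen == 2:
            area, data = self.source_data
            self.targets[last_location] = data   # copy the data to the target area
            return last_location
        return None

if __name__ == "__main__":
    device = DualScreenClipboard(unfolded=True)
    device.select_source(screen=1, area="paragraph-3", data="Hello")
    target = device.drag(path=[(1, "edge"), (2, "notes-pane")])
    print(target, device.targets)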

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (following steps) of response in the virtual bezel region .
US20120081317A1
CLAIM 10
. A computer readable medium for performing a copy-paste operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and second display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input from a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area from which displayed data is to be copied ;
receiving an input from a finger drag gesture for identifying a target area of the second display screen into which data from the source area is to be copied , wherein the target area corresponds to a location of the second display screen where the drag gesture is last detected before it ceases to be detected ;
wherein the finger drag gesture includes a continuous contact with the first display screen from the first finger gesture ;
and copying the data to the target area .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081316A1

Filed: 2011-09-28     Issued: 2012-04-05

Off-screen gesture dismissable keyboard

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (display output) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081316A1
CLAIM 4
. The method of claim 3 , wherein a first portion of the set of virtual keys is presented by a first touch screen included as part of a first screen of the device , wherein a second portion (second portion, usage frequency) of the set of virtual keys is presented by a second touch screen included as part of a second screen of the device , and wherein the gesture capture region is included as part of one of the first screen of the device and the second screen of the device .

US20120081316A1
CLAIM 12
. The device of claim 11 , wherein the gesture capture region is not operative to display output (display screen, screen mode) to the user .

US20120081316A1
CLAIM 17
. A computer readable medium having stored thereon computer-executable instructions , the computer executable instructions causing a processor to execute a method for dismissing virtual key sets , the computer-executable instructions comprising : instructions to display a first set (first set) of virtual keys on a touch screen ;
instructions to determine whether a user input (user input) has been received within a gesture capture region ;
instructions to discontinue the display of the first set of virtual keys on the touch screen in response to a determination that user input has been received within the gesture capture region .
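
A minimal Python sketch of the dismissal logic recited in claim 17 of US20120081316A1: display a first set of virtual keys, then discontinue the display when input lands in a gesture capture region. KeyboardController and gesture_region are hypothetical names, and the coordinate values are assumed for illustration.

class KeyboardController:
    """Hypothetical sketch: show a first set of virtual keys, then dismiss them
    when user input is received within a gesture capture region."""

    def __init__(self, gesture_region):
        self.gesture_region = gesture_region   # (x0, y0, x1, y1) of the capture region
        self.keys_visible = False

    def show_keys(self):
        self.keys_visible = True               # display the first set of virtual keys

    def on_touch(self, x, y):
        x0, y0, x1, y1 = self.gesture_region
        if x0 <= x <= x1 and y0 <= y <= y1:
            # Input received within the gesture capture region: discontinue
            # the display of the virtual keys.
            self.keys_visible = False

if __name__ == "__main__":
    kb = KeyboardController(gesture_region=(0, 1920, 1080, 1980))
    kb.show_keys()
    kb.on_touch(500, 1950)                     # touch inside the capture region
    print(kb.keys_visible)                     # False: keyboard dismissed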

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (display output) .
US20120081316A1
CLAIM 12
. The device of claim 11 , wherein the gesture capture region is not operative to display output (display screen, screen mode) to the user .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (display output) .
US20120081316A1
CLAIM 12
. The device of claim 11 , wherein the gesture capture region is not operative to display output (display screen, screen mode) to the user .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (display output) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081316A1
CLAIM 12
. The device of claim 11 , wherein the gesture capture region is not operative to display output (display screen, screen mode) to the user .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (display output) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081316A1
CLAIM 4
. The method of claim 3 , wherein a first portion of the set of virtual keys is presented by a first touch screen included as part of a first screen of the device , wherein a second portion (second portion, usage frequency) of the set of virtual keys is presented by a second touch screen included as part of a second screen of the device , and wherein the gesture capture region is included as part of one of the first screen of the device and the second screen of the device .

US20120081316A1
CLAIM 12
. The device of claim 11 , wherein the gesture capture region is not operative to display output (display screen, screen mode) to the user .

US20120081316A1
CLAIM 17
. A computer readable medium having stored thereon computer-executable instructions , the computer executable instructions causing a processor to execute a method for dismissing virtual key sets , the computer-executable instructions comprising : instructions to display a first set of virtual keys on a touch screen ;
instructions to determine whether a user input (user input) has been received within a gesture capture region ;
instructions to discontinue the display of the first set of virtual keys on the touch screen in response to a determination that user input has been received within the gesture capture region .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (display output) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120081316A1
CLAIM 12
. The device of claim 11 , wherein the gesture capture region is not operative to display output (display screen, screen mode) to the user .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081316A1
CLAIM 17
. A computer readable medium having stored thereon computer-executable instructions , the computer executable instructions causing a processor to execute a method for dismissing virtual key sets , the computer-executable instructions comprising : instructions to display a first set of virtual keys on a touch screen ;
instructions to determine whether a user input (user input) has been received within a gesture capture region ;
instructions to discontinue the display of the first set of virtual keys on the touch screen in response to a determination that user input has been received within the gesture capture region .
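
For the charted claim 16 of US9645663B2, a minimal Python sketch of the recited steps: detect the grip region, register it as the virtual bezel, and record the user's chosen response as personalized behavior. BezelPersonalizer and its methods are hypothetical, and treating the grip region as a bounding box is an assumption made for illustration only.

class BezelPersonalizer:
    """Hypothetical sketch: register the region under the holding hand as a
    virtual bezel and record the user's preferred response for that region."""

    def __init__(self):
        self.bezel_regions = []     # regions registered in device memory
        self.responses = {}         # region index -> personalized behavior

    def detect_grip(self, contact_points):
        # Treat the bounding box of the gripping fingers as the detected region.
        xs = [p[0] for p in contact_points]
        ys = [p[1] for p in contact_points]
        region = (min(xs), min(ys), max(xs), max(ys))
        self.bezel_regions.append(region)
        return len(self.bezel_regions) - 1

    def register_response(self, region_id, response):
        # Store the user's chosen response type for touches in that region,
        # e.g. "scroll", "page-turn", or "ignore".
        self.responses[region_id] = response

    def interpret(self, region_id, gesture):
        # Touch input in the bezel is interpreted as intentional input intended
        # to affect the content displayed in the active region.
        return f"apply '{self.responses.get(region_id, 'default')}' for {gesture}"

if __name__ == "__main__":
    p = BezelPersonalizer()
    rid = p.detect_grip([(5, 700), (12, 760), (8, 820)])
    p.register_response(rid, "page-turn")
    print(p.interpret(rid, "swipe-up"))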

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (second areas) , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081316A1
CLAIM 4
. The method of claim 3 , wherein a first portion of the set of virtual keys is presented by a first touch screen included as part of a first screen of the device , wherein a second portion (second portion, usage frequency) of the set of virtual keys is presented by a second touch screen included as part of a second screen of the device , and wherein the gesture capture region is included as part of one of the first screen of the device and the second screen of the device .

US20120081316A1
CLAIM 13
. The device of claim 11 , wherein the first and second areas (touchscreen area) of the screen do not overlap .

US20120081316A1
CLAIM 17
. A computer readable medium having stored thereon computer-executable instructions , the computer executable instructions causing a processor to execute a method for dismissing virtual key sets , the computer-executable instructions comprising : instructions to display a first set of virtual keys on a touch screen ;
instructions to determine whether a user input (user input) has been received within a gesture capture region ;
instructions to discontinue the display of the first set of virtual keys on the touch screen in response to a determination that user input has been received within the gesture capture region .
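
For the charted claim 17 of US9645663B2, a minimal Python sketch of the recited steps: register the unintentional grip contact as a polygonal area, track its access frequency, and define the most frequently accessed polygon as the personalized holding pattern. HoldingPatternLearner is a hypothetical name.

class HoldingPatternLearner:
    """Hypothetical sketch: accumulate unintentional grip touches as polygonal
    areas, count how often each is hit, and define the most frequent one as
    the personalized virtual bezel region."""

    def __init__(self):
        self.polygons = {}          # polygon (tuple of vertices) -> access count

    def record_unintentional_touch(self, vertices):
        # The grip contact is registered as a polygon on the touchscreen display.
        key = tuple(sorted(vertices))
        self.polygons[key] = self.polygons.get(key, 0) + 1

    def personalized_bezel(self):
        # The polygon with the highest detected usage frequency defines the
        # user's holding pattern, and hence the virtual bezel region.
        if not self.polygons:
            return None
        return max(self.polygons, key=self.polygons.get)

if __name__ == "__main__":
    learner = HoldingPatternLearner()
    grip = [(0, 650), (30, 650), (30, 900), (0, 900)]
    for _ in range(5):
        learner.record_unintentional_touch(grip)
    learner.record_unintentional_touch([(1050, 0), (1080, 0), (1080, 120), (1050, 120)])
    print(learner.personalized_bezel())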

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081316A1
CLAIM 4
. The method of claim 3 , wherein a first portion of the set of virtual keys is presented by a first touch screen included as part of a first screen of the device , wherein a second portion (second portion, usage frequency) of the set of virtual keys is presented by a second touch screen included as part of a second screen of the device , and wherein the gesture capture region is included as part of one of the first screen of the device and the second screen of the device .

US20120081316A1
CLAIM 17
. A computer readable medium having stored thereon computer-executable instructions , the computer executable instructions causing a processor to execute a method for dismissing virtual key sets , the computer-executable instructions comprising : instructions to display a first set of virtual keys on a touch screen ;
instructions to determine whether a user input (user input) has been received within a gesture capture region ;
instructions to discontinue the display of the first set of virtual keys on the touch screen in response to a determination that user input has been received within the gesture capture region .
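
For the charted claim 18 of US9645663B2, a minimal Python sketch that derives a polygonal grip area from a thermal sensor reading and tracks its usage frequency. ThermalBezelMapper, HEAT_THRESHOLD, and the bounding-box polygon are assumptions made for illustration only.

class ThermalBezelMapper:
    """Hypothetical sketch: derive a polygonal grip area from a thermal reading
    and fold it into a personalized holding pattern for the virtual bezel."""

    HEAT_THRESHOLD = 0.6    # normalized sensor value treated as 'hand contact'

    def __init__(self):
        self.registered = []    # (polygon, access_count) pairs kept in memory

    def polygon_from_heat(self, heat_map):
        # heat_map: {(x, y): normalized temperature}. The warm cells' bounding
        # box stands in for the polygon formed by the heat signature.
        warm = [cell for cell, value in heat_map.items() if value >= self.HEAT_THRESHOLD]
        if not warm:
            return None
        xs, ys = [c[0] for c in warm], [c[1] for c in warm]
        return ((min(xs), min(ys)), (max(xs), min(ys)), (max(xs), max(ys)), (min(xs), max(ys)))

    def register(self, polygon):
        # Register the polygonal area and count how often it is accessed.
        for i, (poly, count) in enumerate(self.registered):
            if poly == polygon:
                self.registered[i] = (poly, count + 1)
                return
        self.registered.append((polygon, 1))

    def holding_pattern(self):
        # The most frequently seen polygon defines the virtual bezel region.
        return max(self.registered, key=lambda item: item[1])[0] if self.registered else None

if __name__ == "__main__":
    mapper = ThermalBezelMapper()
    poly = mapper.polygon_from_heat({(0, 700): 0.9, (1, 701): 0.8, (50, 50): 0.1})
    mapper.register(poly)
    print(mapper.holding_pattern())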




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081399A1

Filed: 2011-09-28     Issued: 2012-04-05

Visible card stack

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081399A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different application (first set) s .

US20120081399A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081399A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081399A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .
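
For the charted claims 12 and 13 of US9645663B2, a minimal Python sketch of a status display panel that the user toggles between visible and hidden modes, with a third set of touch inputs still available for navigation while the panel is hidden. StatusPanel and the "edge-swipe" gesture name are hypothetical.

class StatusPanel:
    """Hypothetical sketch: a status display panel toggled between visible and
    hidden modes, with navigation still possible while it is hidden."""

    def __init__(self):
        self.visible = True
        self.items = {"battery": "80%", "signal": "4 bars"}

    def toggle(self):
        # The user toggles the panel between a visible mode and a hidden mode.
        self.visible = not self.visible

    def handle_touch(self, gesture):
        if self.visible:
            return f"panel shows {self.items}"
        # Hidden mode: a third set of touch-based inputs (e.g. an edge swipe)
        # still lets the user navigate the device.
        if gesture == "edge-swipe":
            return "navigate: reveal panel or switch view"
        return "ignored"

if __name__ == "__main__":
    panel = StatusPanel()
    panel.toggle()                      # hide the panel
    print(panel.handle_touch("edge-swipe"))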

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081399A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081398A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Alexander de Paz, Martin Gimpl, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion, n storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081398A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different application (first set) s .

US20120081398A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US20120081398A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion (first portion) for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .
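
As context for the mapped claim 20 of US20120081398A1, a minimal Python sketch of entering multi-application mode by splitting the display into a first and a second logical portion. SplitScreenManager is a hypothetical name and the vertical split is an assumed layout.

class SplitScreenManager:
    """Hypothetical sketch: on a request to enter multi-application mode, split
    the display into two logical portions, one per window."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.layout = {"full": (0, 0, width, height)}

    def enter_multi_application_mode(self, first_window, second_window):
        # Split the display into a first portion for the first window and a
        # second portion for the second window (a simple vertical split here).
        half = self.width // 2
        self.layout = {
            first_window: (0, 0, half, self.height),
            second_window: (half, 0, self.width, self.height),
        }
        return self.layout

if __name__ == "__main__":
    manager = SplitScreenManager(1280, 800)
    print(manager.enter_multi_application_mode("mail", "browser"))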

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081398A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081398A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion, n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081398A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US20120081398A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion (first portion) for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081398A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081398A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081315A1

Filed: 2011-09-28     Issued: 2012-04-05

Keyboard spanning multiple screens

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (first one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081315A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
in a first operating mode , first presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion of the virtual keyboard is presented using a first portion of the second touch screen .

US20120081315A1
CLAIM 13
. The device of claim 12 , wherein the application programming is further operable to : in a second operating mode , display the virtual keyboard using at least a portion of a first one (first set, second set) of the touch sensitive display area of the first screen and the second screen , and without using any portion of a second one of the touch sensitive display area of the first screen and the second screen .
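
As context for the mapped claims 1 and 13 of US20120081315A1, a minimal Python sketch of a virtual keyboard that spans both touch screens in a first operating mode and collapses onto a single screen in a second operating mode. SpanningKeyboard and the even key split are illustrative assumptions.

class SpanningKeyboard:
    """Hypothetical sketch: in a first mode the virtual keyboard spans both
    touch screens; in a second mode it is drawn entirely on one of them."""

    KEYS = list("qwertyuiopasdfghjklzxcvbnm")

    def layout(self, mode: str):
        if mode == "spanning":
            # First operating mode: a first portion of the keyboard on screen 1,
            # a second portion on screen 2.
            half = len(self.KEYS) // 2
            return {"screen1": self.KEYS[:half], "screen2": self.KEYS[half:]}
        # Second operating mode: the whole keyboard on one screen only.
        return {"screen1": self.KEYS, "screen2": []}

if __name__ == "__main__":
    kb = SpanningKeyboard()
    print({k: len(v) for k, v in kb.layout("spanning").items()})
    print({k: len(v) for k, v in kb.layout("single").items()})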

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081315A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
in a first operating mode , first presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion of the virtual keyboard is presented using a first portion of the second touch screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081854A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen desktop

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, John Steven Visosky, Alexander de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081854A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different application (first set) s .

US20120081854A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081854A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081854A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081854A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081403A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Alexander de Paz, Martin Gimpl, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion, n storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081403A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different application (first set) s .

US20120081403A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US20120081403A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion (first portion) for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081403A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081403A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion, n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081403A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US20120081403A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion (first portion) for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081403A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081403A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081400A1

Filed: 2011-09-28     Issued: 2012-04-05

Dual-screen view in response to rotation

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Rodney Wayne Schrock, Martin Gimpl, Sanjiv Sirpal, John Steven Visosky
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081400A1
CLAIM 9
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
instructions configured to determine that the device has been rotated a second time after the first time ;
and instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
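
As context for the mapped claims 9 and 15 of US20120081400A1, a minimal Python sketch of the rotation-driven view changes: the first rotation stops displaying the first application, and the second rotation brings up the second and third applications together. RotationViewManager is a hypothetical name.

class RotationViewManager:
    """Hypothetical sketch: the first rotation drops one of two displayed
    applications; a second rotation shows the remaining application alongside
    a third one."""

    def __init__(self, first_app, second_app, third_app):
        self.apps = [first_app, second_app, third_app]
        self.rotations = 0
        self.displayed = [first_app, second_app]   # first open state: two apps shown

    def rotate(self):
        self.rotations += 1
        if self.rotations == 1:
            # Second open state: data from the first application is no longer displayed.
            self.displayed = [self.apps[1]]
        elif self.rotations >= 2:
            # After the second rotation, show the second and third applications.
            self.displayed = [self.apps[1], self.apps[2]]
        return list(self.displayed)

if __name__ == "__main__":
    manager = RotationViewManager("mail", "browser", "calendar")
    print(manager.rotate())    # ['browser']
    print(manager.rotate())    # ['browser', 'calendar']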

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
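Claims 5 through 7 recite three alternative ways to resolve a multi-touch input that begins simultaneously in both regions: attribute it to the virtual bezel, attribute it to the active region, or follow an instruction previously given by the user. A minimal sketch of that resolution logic, with hypothetical region labels and a hypothetical preference value, might look as follows.

```python
# Illustrative sketch only: a multi-touch gesture whose contacts begin
# simultaneously in both regions is resolved to a single region, either by a
# fixed rule (claims 5 and 6) or by a stored user instruction (claim 7).

def resolve_multitouch(contact_regions, user_preference=None):
    """contact_regions: set of regions ('active', 'virtual_bezel') in which
    the simultaneous contacts originated."""
    if contact_regions != {"active", "virtual_bezel"}:
        # All contacts began in one region: nothing to resolve.
        return next(iter(contact_regions))
    if user_preference in ("active", "virtual_bezel"):
        return user_preference          # claim 7: user-instructed behavior
    return "virtual_bezel"              # claim 5 default; claim 6 would return "active"

if __name__ == "__main__":
    print(resolve_multitouch({"active", "virtual_bezel"}))                             # virtual_bezel
    print(resolve_multitouch({"active", "virtual_bezel"}, user_preference="active"))   # active
```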

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
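Claim 8 places an operating system status bar in the virtual bezel and lets a predefined gesture toggle its visibility to enter or leave a full-screen mode. A toy illustration, assuming a hypothetical gesture name, could be:

```python
# Illustrative sketch only: a status bar hosted in the virtual bezel is hidden
# or revealed when a predefined gesture is recognized (cf. claim 8).

class StatusBar:
    def __init__(self):
        self.visible = True

    def handle_gesture(self, gesture):
        # "two_finger_swipe_down" is a hypothetical gesture name chosen for this
        # example; the claim only requires some predefined set of gestures.
        if gesture == "two_finger_swipe_down":
            self.visible = not self.visible
        return self.visible

if __name__ == "__main__":
    bar = StatusBar()
    print(bar.handle_gesture("two_finger_swipe_down"))  # False -> full-screen mode
    print(bar.handle_gesture("two_finger_swipe_down"))  # True  -> status bar restored
```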

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
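Claims 9 through 11 recite a set of touch-based soft buttons confined to the virtual bezel that the user can reposition, hide or show, and extend with new buttons. One hedged sketch of such a button registry, with hypothetical button names and coordinates, is:

```python
# Illustrative sketch only: soft buttons confined to the virtual bezel can be
# repositioned, hidden/shown, and added by the user (cf. claims 9-11).

class BezelButtonRegistry:
    def __init__(self):
        # button name -> {"pos": (x, y), "visible": bool}; names are hypothetical.
        self.buttons = {"back": {"pos": (10, 100), "visible": True}}

    def reposition(self, name, new_pos):
        self.buttons[name]["pos"] = new_pos                                 # claim 9

    def toggle_visibility(self, name):
        self.buttons[name]["visible"] = not self.buttons[name]["visible"]   # claim 10

    def add_button(self, name, pos):
        self.buttons[name] = {"pos": pos, "visible": True}                  # claim 11

if __name__ == "__main__":
    registry = BezelButtonRegistry()
    registry.reposition("back", (10, 300))
    registry.toggle_visibility("back")
    registry.add_button("screenshot", (10, 500))
    print(registry.buttons)
```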

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set (second set, display system) of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081400A1
CLAIM 15
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
a second set of instructions configured to determine that the device has been rotated a second time after the first time ;
and a third set (third set) of instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081400A1
CLAIM 9
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
instructions configured to determine that the device has been rotated a second time after the first time ;
and instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
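For comparison, the rotation-driven display logic recited in claims 9 and 15 of US20120081400A1 can be summarized in a short sketch: a first rotation moves the device from a first open state (two applications shown) to a second open state (the first application no longer shown), and a second rotation automatically displays the second and a third application. The application names below are hypothetical.

```python
# Illustrative sketch only: display logic driven by successive device
# rotations, as recited in claims 9 and 15 of US20120081400A1.

class MultiScreenDevice:
    def __init__(self):
        self.open_state = "first"
        self.displayed = ["app1", "app2"]   # first open state: both applications shown
        self.rotations = 0

    def rotate(self):
        self.rotations += 1
        if self.rotations == 1:
            # First rotation: second open state, first application no longer shown.
            self.open_state = "second"
            self.displayed = ["app2"]
        elif self.rotations == 2:
            # Second rotation: second and third applications shown automatically.
            self.displayed = ["app2", "app3"]
        return self.displayed

if __name__ == "__main__":
    device = MultiScreenDevice()
    print(device.rotate())  # ['app2']
    print(device.rotate())  # ['app2', 'app3']
```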

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081400A1
CLAIM 9
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
instructions configured to determine that the device has been rotated a second time after the first time ;
and instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
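Claim 16 defines the virtual bezel from the region the user's fingers contact while holding the device, registers that region in memory, interprets input there as intentional, and stores the user's chosen response type as personalized behavior. A minimal sketch of that flow, with hypothetical method and preference names, follows.

```python
# Illustrative sketch only: derive a virtual bezel from detected grip contact
# and store a per-user response preference for that region (cf. claim 16).

class VirtualBezelConfigurator:
    def __init__(self):
        self.bezel_points = set()       # registered grip-contact region
        self.response_preference = None

    def detect_grip(self, contact_points):
        # Register the detected grip-contact points as the virtual bezel region.
        self.bezel_points.update(contact_points)

    def set_response_preference(self, preference):
        # The user instructs the system what type of response to execute for
        # input landing in the registered region.
        self.response_preference = preference

    def handle_touch(self, point):
        if point in self.bezel_points:
            return self.response_preference or "interpret_as_intentional"
        return "normal_active_region_input"

if __name__ == "__main__":
    config = VirtualBezelConfigurator()
    config.detect_grip({(5, 800), (5, 820), (5, 840)})
    config.set_response_preference("scroll_active_content")
    print(config.handle_touch((5, 820)))    # scroll_active_content
    print(config.handle_touch((500, 900)))  # normal_active_region_input
```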

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081400A1
CLAIM 9
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
instructions configured to determine that the device has been rotated a second time after the first time ;
and instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
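Claim 17 builds the virtual bezel from a polygonal area formed by unintentional grip touches, tracks how frequently that area is contacted, and registers the result as a personalized holding pattern. The sketch below assumes a hypothetical frequency threshold.

```python
# Illustrative sketch only: register a polygonal grip area, count how often it
# is contacted, and promote it to a personalized holding pattern that defines
# the virtual bezel (cf. claim 17).

from collections import defaultdict

class HoldingPatternLearner:
    def __init__(self, threshold=10):
        self.area_hits = defaultdict(int)   # polygon (tuple of vertices) -> count
        self.threshold = threshold
        self.virtual_bezel = None

    def register_unintentional_touch(self, polygon_vertices):
        polygon = tuple(polygon_vertices)
        self.area_hits[polygon] += 1        # detect access frequency
        if self.area_hits[polygon] >= self.threshold:
            # A frequently contacted area becomes the personalized holding pattern.
            self.virtual_bezel = polygon
        return self.virtual_bezel

if __name__ == "__main__":
    learner = HoldingPatternLearner(threshold=3)
    grip = [(0, 700), (40, 700), (40, 1000), (0, 1000)]
    for _ in range(3):
        bezel = learner.register_unintentional_touch(grip)
    print(bezel)
```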

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081400A1
CLAIM 9
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a multi-screen device has been rotated a first time from a first open state to a second different open state , wherein data from a first and second application are displayed while the device is in the first open state and wherein data from the first application is not displayed while the device is in the second open state ;
instructions configured to determine that the device has been rotated a second time after the first time ;
and instructions configured to automatically cause data from the second application and a third application to be displayed on the device in response to determining that the device has been rotated the second time .
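Claim 18 differs from claim 17 chiefly in deriving the grip polygon from a heat signature reported by the device's thermal sensors. A rough illustration, assuming a hypothetical sensor grid and temperature threshold, might be:

```python
# Illustrative sketch only: derive the grip polygon from thermal-sensor
# readings rather than from touch events (cf. claim 18).

def heat_signature_polygon(thermal_grid, threshold=30.0):
    """Return the bounding polygon (corner vertices) of cells whose reading
    exceeds the threshold, approximating where the user's hand rests."""
    warm = [(x, y) for y, row in enumerate(thermal_grid)
                   for x, temp in enumerate(row) if temp >= threshold]
    if not warm:
        return None
    xs = [x for x, _ in warm]
    ys = [y for _, y in warm]
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

if __name__ == "__main__":
    grid = [
        [22.0, 22.0, 22.0, 22.0],
        [34.5, 35.0, 22.0, 22.0],   # warm cells where the hand rests
        [34.0, 33.5, 22.0, 22.0],
        [22.0, 22.0, 22.0, 22.0],
    ]
    print(heat_signature_polygon(grid))  # [(0, 1), (1, 1), (1, 2), (0, 2)]
```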




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081314A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen desktop

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, John Steven Visosky, Alexander de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081314A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different applications (first set) .

US20120081314A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .
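Claims 1 and 14 turn on two distinct modes of response: the active touchscreen region treats every touch as ordinary application input, while the virtual bezel region selectively interprets a touch as intentional input meant to affect the content shown in the active region. The sketch below uses a hypothetical dwell-time heuristic purely to illustrate "selective interpretation"; the patent does not specify any particular heuristic.

```python
# Illustrative sketch only: two modes of response. The active region handles
# touches directly, while the virtual bezel decides whether a touch is
# intentional (and should affect active-region content) or incidental grip
# contact to be ignored (cf. claims 1 and 14).

def bezel_mode_response(touch_duration_ms, moved_pixels):
    """Second mode of response: classify a bezel touch."""
    if touch_duration_ms > 500 and moved_pixels < 5:
        return "ignore"                    # likely a resting finger while holding
    return "forward_to_active_content"     # intentional input affecting content

def active_mode_response(_touch):
    """First mode of response: every touch is ordinary application input."""
    return "deliver_to_application"

if __name__ == "__main__":
    print(bezel_mode_response(touch_duration_ms=1200, moved_pixels=2))   # ignore
    print(bezel_mode_response(touch_duration_ms=120, moved_pixels=60))   # forward_to_active_content
    print(active_mode_response((400, 800)))                              # deliver_to_application
```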

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081314A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081314A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081314A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084710A1

Filed: 2011-09-28     Issued: 2012-04-05

Repositioning windows in the pop-up window

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, Eduardo Diego Torres Milano
US9645663B2
CLAIM 1
. A display system (second set, one screen) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set, one screen) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .
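For the cited reference, claim 17 of US20120084710A1 recites a window stack: an ordered, logical representation of running desktops and applications in which a selected entry can be moved to a visible position on a display or to a non-visible position, with the stack then displayed in its new order. A minimal sketch, with hypothetical entry names and a two-display assumption, follows.

```python
# Illustrative sketch only: the window-stack behavior recited in claim 17 of
# US20120084710A1. Entry names and the two-display layout are hypothetical.

class WindowStack:
    VISIBLE_SLOTS = 2  # e.g. one visible slot per display on a two-display device

    def __init__(self, entries):
        self.stack = list(entries)  # indices 0..VISIBLE_SLOTS-1 are visible

    def move(self, name, new_index):
        """Move an application/desktop to a new position in the stack."""
        self.stack.remove(name)
        self.stack.insert(new_index, name)
        return self.stack

    def visible(self):
        return self.stack[:self.VISIBLE_SLOTS]

if __name__ == "__main__":
    stack = WindowStack(["browser", "mail", "notes", "music"])
    print(stack.visible())   # ['browser', 'mail']
    stack.move("notes", 0)   # bring 'notes' to a visible position
    print(stack.visible())   # ['notes', 'browser']
    print(stack.stack)       # the stack displayed in its new order
```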

US9645663B2
CLAIM 2
. The display system (second set, one screen) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 3
. The display system (second set, one screen) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 4
. The display system (second set, one screen) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 5
. The display system (second set, one screen) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 6
. The display system (second set, one screen) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 7
. The display system (second set, one screen) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 8
. The display system (second set, one screen) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 9
. The display system (second set, one screen) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 10
. The display system (second set, one screen) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 11
. The display system (second set, one screen) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 12
. The display system (second set, one screen) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084710A1
CLAIM 3
. The method of claim 1 , wherein the third input is a user gesture comprising a spread gesture , and wherein the spread gesture is executed on off-screen areas of said first and seconds displays with one finger of a user placed on one screen (second set, display system) , and another finger of the user placed on the other screen .

US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set (second set, display system) of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120084710A1
CLAIM 17
. A multi-screen user device , comprising : a first display including a first display area ;
a second display including a second display area ;
a first user input gesture area of the first display ;
a second user input gesture area of the second display , wherein the first and second user input gesture areas are configured to accept input from a user . a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a number and identity of desktops or applications selected to be run by a user ;
a second set of instructions configured to determine , based on the number and identity of the desktops or applications running , a window stack comprising a logical representation of the desktops and applications within an ordered group whereby a user can selectively move a selected desktop or application to either a visible position on another display or a non-visible position ;
and a third set (third set) of instructions responsive to a gesture made by the user that represents an instruction to launch an application manager feature , wherein the applications or desktops are displayed on the first and second displays , and a management window is displayed showing all other applications or desktops currently running ;
a fourth set of instructions responsive to one or more gestures made by the user that represent an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position , and wherein the window stack is displayed in a new order in response to said fourth set of instructions .
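
Claim 13 above adds a third set of touch-based inputs that still allow navigation while the status panel and soft buttons are hidden. The sketch below illustrates that conditional dispatch; the gesture names and navigation actions are hypothetical placeholders, not language from either patent.

```python
# Minimal sketch (hypothetical gesture names) of the third set of inputs in
# US9645663B2 claim 13: bezel gestures map to navigation actions only while
# the status panel and the soft buttons are both in a hidden mode.

from typing import Optional

NAV_GESTURES = {"swipe_left": "back", "swipe_right": "forward", "swipe_up": "home"}

def navigate(gesture: str, panel_hidden: bool, buttons_hidden: bool) -> Optional[str]:
    """Return a navigation action only while the on-screen chrome is hidden."""
    if panel_hidden and buttons_hidden:
        return NAV_GESTURES.get(gesture)
    return None  # chrome visible: the normal soft buttons handle navigation

print(navigate("swipe_left", panel_hidden=True, buttons_hidden=True))   # 'back'
print(navigate("swipe_left", panel_hidden=False, buttons_hidden=True))  # None
```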

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084710A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ; instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .
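
Claim 14 above turns on two different response modes for the same touchscreen layer: a first mode in the active region and a second, selective mode in the virtual bezel region. The following is a minimal sketch of that dispatch under assumed geometry and thresholds; the resolution, bezel width, and tap-versus-grip heuristic are illustrative assumptions only.

```python
# Minimal sketch (hypothetical geometry and thresholds) of the two response
# modes in US9645663B2 claim 14: touches in an edge strip (virtual bezel) are
# screened before they are allowed to affect the active-region content.

SCREEN_W, SCREEN_H = 1080, 1920   # assumed portrait resolution
BEZEL = 60                        # assumed bezel strip width in pixels

def in_virtual_bezel(x: int, y: int) -> bool:
    return x < BEZEL or x > SCREEN_W - BEZEL or y < BEZEL or y > SCREEN_H - BEZEL

def handle_touch(x: int, y: int, duration_ms: int) -> str:
    if not in_virtual_bezel(x, y):
        return "first mode: deliver touch to active region"
    # Second mode: selectively interpret the touch; here a crude heuristic
    # treats brief taps as intentional input and long static contact as a grip.
    if duration_ms < 300:
        return "second mode: intentional input, affect active-region content"
    return "second mode: resting grip, ignore"

print(handle_touch(540, 960, 50))    # centre of screen -> first mode
print(handle_touch(10, 1200, 50))    # quick tap on the edge strip -> intentional
print(handle_touch(10, 1200, 2000))  # sustained edge contact -> ignored grip
```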

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084710A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ; instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .
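
Method claim 16 above registers the grip-contact region as the virtual bezel and stores the user's chosen response as personalized behavior. The sketch below walks through those registration steps; the class, field names, and sample coordinates are hypothetical.

```python
# Minimal sketch (hypothetical data structures) of the steps of US9645663B2
# claim 16: the region under the holding fingers is registered as the virtual
# bezel, and the user's chosen response type is stored for that region.

from typing import Dict, List, Tuple

Point = Tuple[int, int]

class VirtualBezelRegistry:
    def __init__(self) -> None:
        self.bezel_region: List[Point] = []
        self.personalized_behavior: Dict[str, str] = {}

    def register_grip(self, contact_points: List[Point]) -> None:
        # "detecting a region of the touchscreen display in contact with fingers"
        self.bezel_region = contact_points

    def register_response(self, response_type: str) -> None:
        # "registering the user's response instruction ... as personalized behavior"
        self.personalized_behavior["grip_region"] = response_type

registry = VirtualBezelRegistry()
registry.register_grip([(5, 800), (8, 860), (6, 920)])  # thumb contact samples
registry.register_response("scroll_active_region")       # user-selected response
print(registry.bezel_region, registry.personalized_behavior)
```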

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084710A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ; instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .
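
Claim 17 above derives a personalized holding pattern from a registered polygonal area and its access frequency. The sketch below illustrates that frequency-based registration; the hit threshold and polygon coordinates are assumptions for illustration only.

```python
# Minimal sketch (hypothetical threshold) of US9645663B2 claim 17: unintentional
# touches define a polygonal area whose detected usage frequency is used to
# register a personalized holding pattern for the virtual bezel region.

from collections import Counter
from typing import List, Tuple

Point = Tuple[int, int]

class HoldingPatternLearner:
    def __init__(self, min_hits: int = 5) -> None:
        self.min_hits = min_hits
        self.hits: Counter = Counter()

    def record_unintentional_touch(self, polygon: Tuple[Point, ...]) -> None:
        # Each polygonal area is identified by its vertices and counted on access.
        self.hits[polygon] += 1

    def holding_pattern(self) -> List[Tuple[Point, ...]]:
        # Polygons accessed often enough become the personalized holding pattern.
        return [poly for poly, n in self.hits.items() if n >= self.min_hits]

learner = HoldingPatternLearner(min_hits=3)
thumb_area = ((0, 700), (60, 700), (60, 1000), (0, 1000))
for _ in range(4):
    learner.record_unintentional_touch(thumb_area)
print(learner.holding_pattern())  # -> [thumb_area], registered as the virtual bezel
```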

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084710A1
CLAIM 13
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to determine and reveal a first desktop or application on a first display of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to determine and reveal a second desktop or application on a second display of the multi-screen device ;
instructions configured to respond to the first and second predetermined inputs with outputs that cause the first desktop or application to be displayed on the first display and cause the second desktop or application to be displayed on the second display ; instructions configured to receive a third predetermined input that represents an instruction to launch an application manager feature ;
instructions configured to respond to the third predetermined input that causes a display of a management window showing all other applications or desktops currently running ;
instructions configured to receive a fourth predetermined input that represents an instruction to selectively move one of said first and second applications or desktops , or said other applications or desktops , within the window stack to a different position ;
instructions configured to respond to the fourth predetermined input that causes displaying of the window stack in a new order .
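
Claim 18 above differs from claim 17 mainly in that the polygonal area is derived from a heat signature captured by the device's thermal sensors. The sketch below shows one way such a polygon could be computed from a coarse sensor grid; the grid, threshold, and bounding-box reduction are illustrative assumptions, not the patent's disclosed method.

```python
# Minimal sketch (hypothetical sensor grid) of the thermal step of US9645663B2
# claim 18: cells of a coarse thermal-sensor grid that exceed a temperature
# threshold are collected and their bounding polygon is taken as the grip area.

from typing import List, Tuple

def grip_polygon(thermal_grid: List[List[float]], threshold: float = 31.0
                 ) -> List[Tuple[int, int]]:
    """Return bounding-box vertices (col, row) of warm cells, or [] if none."""
    warm = [(c, r) for r, row in enumerate(thermal_grid)
                   for c, t in enumerate(row) if t >= threshold]
    if not warm:
        return []
    xs, ys = [c for c, _ in warm], [r for _, r in warm]
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

grid = [[24.0, 24.0, 24.0],
        [33.5, 24.0, 24.0],   # warm cells along the left edge: the holding hand
        [34.0, 24.0, 24.0]]
print(grip_polygon(grid))  # vertices of the polygonal area to register in memory
```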




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084680A1

Filed: 2011-09-28     Issued: 2012-04-05

Gesture capture for manipulation of presentations on one or more device displays

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Martin Gimpl, Ron Cassar, Maxim Marintchenko, Nikhil Swaminathan
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to a second set (following steps) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084680A1
CLAIM 14
. A computer readable medium for configuring a device having at least one display screen for displaying one or more of screen display presentations and for displaying one or more operably different screen display presentations , wherein the screen display presentations are able to be manipulated via user gesture input differently from the operably different screen display presentations , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : receiving a first user gesture input to a gesture capture area associated with the at least one screen , the gesture capture area being separate from the at least one display screen ;
determining whether a first of the one or more of screen display presentations is displayed on the at least one screen ;
when the first screen display presentation is determined to be displayed , perform a step of interpreting the user gesture input for determining a predetermined operation to apply to the first screen display presentation ;
performing the predetermined operation ;
when none of the one or more screen display presentations are determined to be displayed in the step of determining , perform a step of preventing the device from using the user gesture input to change the at least one display screen ;
receiving a second user gesture input in response to a user gesture input that includes user contact directly to the at least one screen ;
second determining whether one of the operably different screen display presentations is displayed on the at least one screen ;
when the one operably different screen display presentation is determined to be displayed on the at least one screen , performing a step of second interpreting the second user gesture input for determining a predetermined second operation to apply to the one operably different screen display presentation ;
and when none of the operably different screen display presentations are determined in the step of second determining , perform a step of second preventing the device from using the second user gesture input to change the at least one display screen .
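
The US20120084680A1 claim quoted above interprets gestures from an off-display capture area only when a matching screen display presentation is shown, and otherwise prevents the gesture from changing the display. The sketch below reduces that control flow to a single function; the gesture names and operations are hypothetical.

```python
# Minimal sketch (hypothetical names) of the control flow quoted from
# US20120084680A1 claim 14: gestures from the off-display capture area are
# interpreted only when a matching screen display presentation is currently
# shown; otherwise the device is prevented from changing the display.

def handle_capture_area_gesture(gesture: str, presentation_displayed: bool) -> str:
    if presentation_displayed:
        # "interpreting the user gesture input for determining a predetermined operation"
        operation = {"flick_left": "next_presentation",
                     "flick_right": "previous_presentation"}.get(gesture, "ignore")
        return f"perform {operation}"
    # "preventing the device from using the user gesture input to change the display"
    return "prevented"

print(handle_capture_area_gesture("flick_left", presentation_displayed=True))
print(handle_capture_area_gesture("flick_left", presentation_displayed=False))
```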

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items (second determining) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084680A1
CLAIM 1
. A method for configuring a device having at least one display screen for displaying one or more of screen display presentations and for displaying one or more operably different screen display presentations , wherein the screen display presentations are able to be manipulated via user gesture input differently from the operably different screen display presentations , comprising : receiving a first user gesture input to a gesture capture area associated with the at least one screen , the gesture capture area being separate from the at least one display screen ;
determining whether a first of the one or more of screen display presentations is displayed on the at least one screen ;
when the first screen display presentation is determined to be displayed , perform a step of interpreting the user gesture input for determining a predetermined operation to apply to the first screen display presentation ;
performing the predetermined operation ;
when none of the one or more screen display presentations are determined to be displayed in the step of determining , perform a step of preventing the device from using the user gesture input to change the at least one display screen ;
receiving a second user gesture input in response to a user gesture input that includes user contact directly to the at least one screen ;
second determining (information items) whether one of the operably different screen display presentations is displayed on the at least one screen ;
when the one operably different screen display presentation is determined to be displayed on the at least one screen , performing a step of second interpreting the second user gesture input for determining a predetermined second operation to apply to the one operably different screen display presentation ;
and when none of the operably different screen display presentations are determined in the step of second determining , perform a step of second preventing the device from using the second user gesture input to change the at least one display screen .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084680A1
CLAIM 14
. A computer readable medium for configuring a device having at least one display screen for displaying one or more of screen display presentations and for displaying one or more operably different screen display presentations , wherein the screen display presentations are able to be manipulated via user gesture input differently from the operably different screen display presentations , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : receiving a first user gesture input to a gesture capture area associated with the at least one screen , the gesture capture area being separate from the at least one display screen ;
determining whether a first of the one or more of screen display presentations is displayed on the at least one screen ;
when the first screen display presentation is determined to be displayed , perform a step of interpreting the user gesture input for determining a predetermined operation to apply to the first screen display presentation ;
performing the predetermined operation ;
when none of the one or more screen display presentations are determined to be displayed in the step of determining , perform a step of preventing the device from using the user gesture input to change the at least one display screen ;
receiving a second user gesture input in response to a user gesture input that includes user contact directly to the at least one screen ;
second determining whether one of the operably different screen display presentations is displayed on the at least one screen ;
when the one operably different screen display presentation is determined to be displayed on the at least one screen , performing a step of second interpreting the second user gesture input for determining a predetermined second operation to apply to the one operably different screen display presentation ;
and when none of the operably different screen display presentations are determined in the step of second determining , perform a step of second preventing the device from using the second user gesture input to change the at least one display screen .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (following steps) of response in the virtual bezel region .
US20120084680A1
CLAIM 14
. A computer readable medium for configuring a device having at least one display screen for displaying one or more of screen display presentations and for displaying one or more operably different screen display presentations , wherein the screen display presentations are able to be manipulated via user gesture input differently from the operably different screen display presentations , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : receiving a first user gesture input to a gesture capture area associated with the at least one screen , the gesture capture area being separate from the at least one display screen ;
determining whether a first of the one or more of screen display presentations is displayed on the at least one screen ;
when the first screen display presentation is determined to be displayed , perform a step of interpreting the user gesture input for determining a predetermined operation to apply to the first screen display presentation ;
performing the predetermined operation ;
when none of the one or more screen display presentations are determined to be displayed in the step of determining , perform a step of preventing the device from using the user gesture input to change the at least one display screen ;
receiving a second user gesture input in response to a user gesture input that includes user contact directly to the at least one screen ;
second determining whether one of the operably different screen display presentations is displayed on the at least one screen ;
when the one operably different screen display presentation is determined to be displayed on the at least one screen , performing a step of second interpreting the second user gesture input for determining a predetermined second operation to apply to the one operably different screen display presentation ;
and when none of the operably different screen display presentations are determined in the step of second determining , perform a step of second preventing the device from using the second user gesture input to change the at least one display screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084694A1

Filed: 2011-09-28     Issued: 2012-04-05

Method and system for performing drag and drop operations on a device via user gestures

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Paul Reeves, Alexander de Paz, Jared Ficklin, Denise Burton, Gregg Wygonik
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to a second set (following steps) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084694A1
CLAIM 10
. A computer readable medium for performing a drag and drop operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input of a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area of the first display screen , wherein the source area includes data to be copied ;
receiving an input of a finger drag gesture for identifying a target area of the second display screen into which the data from the source data is to be copied , wherein the finger drag gesture extends across a boundary between the first display screen and the second display screen , wherein the first and second display screens are foldable relative to one another along the boundary ;
wherein the target area corresponds to a location on the second display screen where the drag gesture is last detected before it ceases to be detected ;
changing a display of the target area for identifying the target area to a user as able to receive the data from the source area ;
and copying the data into the target area .

US20120084694A1
CLAIM 11
. The computer readable medium of claim 10 , wherein the source area and the target area each correspond to a displayed portion of a different application (first set) window ;
wherein each of the different application windows corresponds to a different software application installed on the device .
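
Claims 10-11 of US20120084694A1, quoted above, describe a drag-and-drop in which the target area is where a drag gesture that crossed the fold boundary is last detected. The sketch below illustrates that boundary-crossing check; the screen width, boundary position, and sample path are assumptions for illustration.

```python
# Minimal sketch (hypothetical coordinates) of the drag-and-drop sequence quoted
# from US20120084694A1 claims 10-11: a source on the first screen, a drag that
# crosses the fold boundary, and a target area where the drag is last detected.

from typing import List, Optional, Tuple

SCREEN_W = 800          # assumed width of each of the two screens
BOUNDARY_X = SCREEN_W   # fold line between the first and second display screens

def drop_target(drag_path: List[Tuple[int, int]]) -> Optional[Tuple[int, int]]:
    """Return the last point of a drag that crossed onto the second screen."""
    crossed = any(x >= BOUNDARY_X for x, _ in drag_path)
    if not crossed:
        return None          # drag stayed on the first screen: no cross-screen drop
    return drag_path[-1]     # location where the drag gesture is last detected

path = [(700, 300), (790, 310), (820, 320), (950, 330)]
print("copy source data into target area at", drop_target(path))
```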

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084694A1
CLAIM 10
. A computer readable medium for performing a drag and drop operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input of a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area of the first display screen , wherein the source area includes data to be copied ;
receiving an input of a finger drag gesture for identifying a target area of the second display screen into which the data from the source data is to be copied , wherein the finger drag gesture extends across a boundary between the first display screen and the second display screen , wherein the first and second display screens are foldable relative to one another along the boundary ;
wherein the target area corresponds to a location on the second display screen where the drag gesture is last detected before it ceases to be detected ;
changing a display of the target area for identifying the target area to a user as able to receive the data from the source area ;
and copying the data into the target area .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (following steps) of response in the virtual bezel region .
US20120084694A1
CLAIM 10
. A computer readable medium for performing a drag and drop operation using user finger gesture inputs to first and second display screens of a device , wherein the device includes a folding mechanism attached to each of the first and second display screens for providing the first and seconds display screens in a folded configuration wherein the first and second display screens face in opposite directions , and in an unfolded configuration wherein the first and second display screens face in a substantially same direction , comprising : machine instructions for performing the following steps (second mode, second set, second portion) : determining that the first and second display screens are in the unfolded configuration ;
receiving an input of a first finger gesture to the first display screen , wherein the first finger gesture input is for identifying a source area of the first display screen , wherein the source area includes data to be copied ;
receiving an input of a finger drag gesture for identifying a target area of the second display screen into which the data from the source data is to be copied , wherein the finger drag gesture extends across a boundary between the first display screen and the second display screen , wherein the first and second display screens are foldable relative to one another along the boundary ;
wherein the target area corresponds to a location on the second display screen where the drag gesture is last detected before it ceases to be detected ;
changing a display of the target area for identifying the target area to a user as able to receive the data from the source area ;
and copying the data into the target area .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084700A1

Filed: 2011-09-28     Issued: 2012-04-05

Keyboard dismissed on closure of device

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (closed state) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (first one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084700A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
in a first operating mode , presenting a virtual keyboard , at least a first portion of a first window , and at least a first portion of a second window , wherein a keyboard focus is on one of the first window or the second window , wherein both the first and second screens are in view of a user ;
receiving input from the user , wherein the input includes closing the device to place the device in a second operating mode , wherein only one of the first and second screens is in view of the user ;
determining whether the user input (user input) to place the device in the second operating mode removes the window with keyboard focus from the view of the user ;
in response to determining that the user input to place the device in the second operating mode does remove the window with keyboard focus from the view of user , presenting at least one of the first portion of the window that does not have keyboard focus and a second portion of the window that does not have keyboard focus , without a presentation of the virtual keyboard .

US20120084700A1
CLAIM 12
. The device of claim 11 , further comprising : a hinge , wherein the hinge interconnects the first and second screens , wherein in the first operating mode the device is in an open state , and wherein in the second operating mode the device is in a closed state (first portion) .

US20120084700A1
CLAIM 20
. The computer readable medium of claim 17 , wherein in the first operating mode the virtual keyboard is presented using a first portion of the first touch sensitive screen and a first portion of the second touch sensitive screen , wherein the first portion of the first window is presented by a second portion of a first one (first set, second set) of the first and second touch sensitive screens , and wherein at least a first portion of a second window is presented by a second portion of the second touch sensitive screen .
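
The US20120084700A1 claims quoted above dismiss the virtual keyboard when closing the device removes the window holding keyboard focus from the user's view. The sketch below captures that decision as a small function; the window identifiers and return structure are hypothetical.

```python
# Minimal sketch (hypothetical state model) of the behaviour quoted from
# US20120084700A1 claim 1: if closing the device removes the window that has
# keyboard focus from view, the virtual keyboard is not presented.

def present_after_close(focused_window: str, visible_after_close: str) -> dict:
    focus_removed = focused_window != visible_after_close
    return {
        "window_shown": visible_after_close,
        # the keyboard is dropped when the focused window is no longer in view
        "virtual_keyboard": not focus_removed,
    }

# Open state: window "A" has keyboard focus; closing leaves only the screen with "B".
print(present_after_close(focused_window="A", visible_after_close="B"))
# Closing leaves the focused window itself in view: the keyboard may remain.
print(present_after_close(focused_window="A", visible_after_close="A"))
```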

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (closed state) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084700A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
in a first operating mode , presenting a virtual keyboard , at least a first portion of a first window , and at least a first portion of a second window , wherein a keyboard focus is on one of the first window or the second window , wherein both the first and second screens are in view of a user ;
receiving input from the user , wherein the input includes closing the device to place the device in a second operating mode , wherein only one of the first and second screens is in view of the user ;
determining whether the user input (user input) to place the device in the second operating mode removes the window with keyboard focus from the view of the user ;
in response to determining that the user input to place the device in the second operating mode does remove the window with keyboard focus from the view of user , presenting at least one of the first portion of the window that does not have keyboard focus and a second portion of the window that does not have keyboard focus , without a presentation of the virtual keyboard .

US20120084700A1
CLAIM 12
. The device of claim 11 , further comprising : a hinge , wherein the hinge interconnects the first and second screens , wherein in the first operating mode the device is in an open state , and wherein in the second operating mode the device is in a closed state (first portion) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084700A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
in a first operating mode , presenting a virtual keyboard , at least a first portion of a first window , and at least a first portion of a second window , wherein a keyboard focus is on one of the first window or the second window , wherein both the first and second screens are in view of a user ;
receiving input from the user , wherein the input includes closing the device to place the device in a second operating mode , wherein only one of the first and second screens is in view of the user ;
determining whether the user input (user input) to place the device in the second operating mode removes the window with keyboard focus from the view of the user ;
in response to determining that the user input to place the device in the second operating mode does remove the window with keyboard focus from the view of user , presenting at least one of the first portion of the window that does not have keyboard focus and a second portion of the window that does not have keyboard focus , without a presentation of the virtual keyboard .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084700A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
in a first operating mode , presenting a virtual keyboard , at least a first portion of a first window , and at least a first portion of a second window , wherein a keyboard focus is on one of the first window or the second window , wherein both the first and second screens are in view of a user ;
receiving input from the user , wherein the input includes closing the device to place the device in a second operating mode , wherein only one of the first and second screens is in view of the user ;
determining whether the user input (user input) to place the device in the second operating mode removes the window with keyboard focus from the view of the user ;
in response to determining that the user input to place the device in the second operating mode does remove the window with keyboard focus from the view of user , presenting at least one of the first portion of the window that does not have keyboard focus and a second portion of the window that does not have keyboard focus , without a presentation of the virtual keyboard .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084700A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
in a first operating mode , presenting a virtual keyboard , at least a first portion of a first window , and at least a first portion of a second window , wherein a keyboard focus is on one of the first window or the second window , wherein both the first and second screens are in view of a user ;
receiving input from the user , wherein the input includes closing the device to place the device in a second operating mode , wherein only one of the first and second screens is in view of the user ;
determining whether the user input (user input) to place the device in the second operating mode removes the window with keyboard focus from the view of the user ;
in response to determining that the user input to place the device in the second operating mode does remove the window with keyboard focus from the view of user , presenting at least one of the first portion of the window that does not have keyboard focus and a second portion of the window that does not have keyboard focus , without a presentation of the virtual keyboard .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084721A1

Filed: 2011-09-28     Issued: 2012-04-05

Window stack modification in response to orientation change

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Martin Gimpl, Paul Edward Reeves, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084721A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has rotated from a first orientation to a second orientation ;
instructions configured to determine a relative size and orientation of at least one window in a window stack stored on the multi-screen device based on the based on the rotation of the multi-screen device ;
and instructions configured to control the display of the at least one window stored in the window stack based on the determined relative size and orientation of the at least one window in the window stack .
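
Claim 8 of US20120084721A1, quoted above, recomputes the relative size and orientation of each window in the stack when the device rotates and then drives the display from the updated stack. The sketch below shows a simple version of that recomputation; the window records and the dimension swap are illustrative assumptions.

```python
# Minimal sketch (hypothetical window records) of US20120084721A1 claim 8:
# on rotation, the relative size and orientation of each window in the stack
# is recomputed and the display is driven from the updated stack.

from typing import Dict, List

def rotate_window_stack(stack: List[Dict], new_orientation: str) -> List[Dict]:
    rotated = []
    for win in stack:
        w, h = win["size"]
        rotated.append({**win,
                        "orientation": new_orientation,
                        "size": (h, w)})   # swap dimensions for the new orientation
    return rotated

stack = [{"id": "mail", "orientation": "portrait", "size": (1080, 1920)},
         {"id": "maps", "orientation": "portrait", "size": (1080, 1920)}]
print(rotate_window_stack(stack, "landscape"))
```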

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084721A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has rotated from a first orientation to a second orientation ;
instructions configured to determine a relative size and orientation of at least one window in a window stack stored on the multi-screen device based on the based on the rotation of the multi-screen device ;
and instructions configured to control the display of the at least one window stored in the window stack based on the determined relative size and orientation of the at least one window in the window stack .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084721A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has rotated from a first orientation to a second orientation ;
instructions configured to determine a relative size and orientation of at least one window in a window stack stored on the multi-screen device based on the based on the rotation of the multi-screen device ;
and instructions configured to control the display of the at least one window stored in the window stack based on the determined relative size and orientation of the at least one window in the window stack .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084721A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has rotated from a first orientation to a second orientation ;
instructions configured to determine a relative size and orientation of at least one window in a window stack stored on the multi-screen device based on the based on the rotation of the multi-screen device ;
and instructions configured to control the display of the at least one window stored in the window stack based on the determined relative size and orientation of the at least one window in the window stack .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084721A1
CLAIM 8
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has rotated from a first orientation to a second orientation ;
instructions configured to determine a relative size and orientation of at least one window in a window stack stored on the multi-screen device based on the based on the rotation of the multi-screen device ;
and instructions configured to control the display of the at least one window stored in the window stack based on the determined relative size and orientation of the at least one window in the window stack .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081270A1

Filed: 2011-09-28     Issued: 2012-04-05

Dual screen application behaviour

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Martin Gimpl, Ron Cassar, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081270A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a dual-screen application on a first screen and a second screen of the multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to deactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that deactivates the second screen display and causes the dual-screen application to be displayed in a single-screen mode on the first screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to reactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that reactivates the second screen display and causes the dual-screen application to continue to be displayed in a single-screen mode on the first screen of the multi-screen device .
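
For orientation only, a minimal Python sketch of the deactivate/reactivate sequence recited in US20120081270A1 claim 10 above: a first predetermined input turns off the second screen and collapses the dual-screen application to single-screen mode, and a second predetermined input turns the second screen back on while the application continues in single-screen mode. The state names and input strings are hypothetical.

class DualScreenDevice:
    def __init__(self):
        self.second_screen_on = True
        self.app_mode = "dual-screen"   # application initially spans both screens

    def handle_input(self, predetermined_input: str) -> None:
        if predetermined_input == "deactivate-second-screen":
            self.second_screen_on = False
            self.app_mode = "single-screen"       # application collapses to the first screen
        elif predetermined_input == "reactivate-second-screen":
            self.second_screen_on = True          # second screen display comes back on ...
            self.app_mode = "single-screen"       # ... but the application stays single-screen

if __name__ == "__main__":
    dev = DualScreenDevice()
    dev.handle_input("deactivate-second-screen")
    dev.handle_input("reactivate-second-screen")
    print(dev.second_screen_on, dev.app_mode)     # True single-screen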

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081270A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a dual-screen application on a first screen and a second screen of the multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to deactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that deactivates the second screen display and causes the dual-screen application to be displayed in a single-screen mode on the first screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to reactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that reactivates the second screen display and causes the dual-screen application to continue to be displayed in a single-screen mode on the first screen of the multi-screen device .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
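
For orientation only, a minimal Python sketch of the claim 16 steps quoted above: detect the region held by the user's fingers, register it as the virtual bezel, treat bezel touches as intentional input, and store the user's chosen response type as personalized behavior for that region. The BezelPersonalizer class, the ask_user callback, and the point-membership test are hypothetical details, not the patent's disclosure.

from typing import Dict, List, Tuple

Point = Tuple[int, int]

class BezelPersonalizer:
    def __init__(self):
        self.bezel_region: List[Point] = []          # registered virtual bezel region
        self.personalized_behavior: Dict[str, str] = {}

    def register_held_region(self, contact_points: List[Point]) -> None:
        """Register the region in contact with the holding hand as the virtual bezel."""
        self.bezel_region = contact_points

    def handle_bezel_touch(self, point: Point, ask_user) -> str:
        if point not in self.bezel_region:
            return "active-region input"
        key = str(point)
        if key not in self.personalized_behavior:
            # Offer the user to instruct what type of response to execute, then register it.
            self.personalized_behavior[key] = ask_user(point)
        return f"intentional input -> {self.personalized_behavior[key]}"

if __name__ == "__main__":
    p = BezelPersonalizer()
    p.register_held_region([(5, 400), (6, 401)])
    print(p.handle_bezel_touch((5, 400), ask_user=lambda pt: "scroll"))
    print(p.handle_bezel_touch((5, 400), ask_user=lambda pt: "ignored"))  # reuses stored choice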
US20120081270A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a dual-screen application on a first screen and a second screen of the multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to deactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that deactivates the second screen display and causes the dual-screen application to be displayed in a single-screen mode on the first screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to reactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that reactivates the second screen display and causes the dual-screen application to continue to be displayed in a single-screen mode on the first screen of the multi-screen device .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
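
For orientation only, a minimal Python sketch of the claim 17 method quoted above: register the polygonal area touched while holding the device, count how often it is accessed, and use that usage frequency to define a personalized holding pattern that in turn defines the virtual bezel region. The frequency threshold of three contacts is an assumption; the claim does not recite a specific value.

from collections import Counter
from typing import List, Tuple

Polygon = Tuple[Tuple[int, int], ...]   # vertices of the touched polygonal area

class HoldingPatternLearner:
    def __init__(self, threshold: int = 3):
        self.registered: Counter = Counter()   # polygon -> detected usage frequency
        self.threshold = threshold
        self.virtual_bezel: List[Polygon] = []

    def record_unintentional_touch(self, polygon: Polygon) -> None:
        self.registered[polygon] += 1          # register polygon and count accesses
        if self.registered[polygon] >= self.threshold and polygon not in self.virtual_bezel:
            # Frequent contact defines the personalized holding pattern,
            # which in turn defines the virtual bezel region.
            self.virtual_bezel.append(polygon)

if __name__ == "__main__":
    learner = HoldingPatternLearner()
    grip = ((0, 350), (40, 350), (40, 700), (0, 700))
    for _ in range(3):
        learner.record_unintentional_touch(grip)
    print(learner.virtual_bezel)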
US20120081270A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a dual-screen application on a first screen and a second screen of the multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to deactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that deactivates the second screen display and causes the dual-screen application to be displayed in a single-screen mode on the first screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to reactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that reactivates the second screen display and causes the dual-screen application to continue to be displayed in a single-screen mode on the first screen of the multi-screen device .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
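
For orientation only, a minimal Python sketch of the thermal-sensor step of claim 18 quoted above, reducing a heat map from the device's thermal sensors to the vertices of a polygonal area warmed by the user's hand. The 2x4 grid and the temperature threshold are assumptions made solely for the example.

from typing import List, Tuple

def heat_signature_polygon(heat_map: List[List[float]],
                           threshold: float = 30.0) -> Tuple[Tuple[int, int], ...]:
    """Return bounding-box vertices of the cells warmed by the user's hand."""
    warm = [(x, y) for y, row in enumerate(heat_map)
                   for x, temp in enumerate(row) if temp >= threshold]
    if not warm:
        return ()
    xs, ys = [p[0] for p in warm], [p[1] for p in warm]
    return ((min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys)))

if __name__ == "__main__":
    grid = [[22.0, 22.0, 31.5, 33.0],
            [22.0, 22.0, 32.0, 34.5]]
    print(heat_signature_polygon(grid))   # ((2, 0), (3, 0), (3, 1), (2, 1))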
US20120081270A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a dual-screen application on a first screen and a second screen of the multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to deactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that deactivates the second screen display and causes the dual-screen application to be displayed in a single-screen mode on the first screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to reactivate displaying to the second screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that reactivates the second screen display and causes the dual-screen application to continue to be displayed in a single-screen mode on the first screen of the multi-screen device .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081271A1

Filed: 2011-09-28     Issued: 2012-04-05

Application display transitions between single and multiple displays

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Martin Gimpl, Paul Reeves, Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set (first set) of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .
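
For orientation only, a minimal Python sketch of the three instruction sets recited in US20120081271A1 claim 10 above: detect the change of physical orientation, determine whether both screens display content for the same application, and change the display configuration accordingly, expanding the application from one screen to both or collapsing it to the first screen. The decision table below is an assumption for illustration only.

from dataclasses import dataclass

@dataclass
class DeviceState:
    orientation: str          # "folded" or "landscape"
    same_app_on_both: bool    # first and second screens show content for one application
    spans_both_screens: bool  # current display configuration of that application

def apply_orientation_change(state: DeviceState, new_orientation: str) -> DeviceState:
    # First set: detect the move to a different physical orientation.
    if new_orientation == state.orientation:
        return state
    state.orientation = new_orientation
    # Second set: check whether both screens are displaying the same application.
    # Third set: change the display to a predetermined configuration that depends on
    # the new orientation and on that determination.
    if state.same_app_on_both and new_orientation == "landscape":
        state.spans_both_screens = True    # (a) only the first screen -> both screens
    else:
        state.spans_both_screens = False   # (b) both screens -> only the first screen
    return state

if __name__ == "__main__":
    s = DeviceState(orientation="folded", same_app_on_both=True, spans_both_screens=False)
    print(apply_orientation_change(s, "landscape"))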

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
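
For orientation only, a minimal Python sketch of the origin-based attribution rule of claims 3 and 4 charted above: a gesture is processed in the region where it originated, regardless of where it terminates. The region names and the Gesture structure are hypothetical.

from dataclasses import dataclass

@dataclass
class Gesture:
    origin_region: str       # "active touchscreen" or "virtual bezel"
    terminal_region: str     # "active touchscreen" or "virtual bezel"

def attribute_gesture(g: Gesture) -> str:
    # Claim 3: originates in the active region, terminates in the bezel -> active-region input.
    # Claim 4: originates in the bezel, terminates in the active region -> bezel input.
    return f"processed as a touch-based input within the {g.origin_region} region"

if __name__ == "__main__":
    print(attribute_gesture(Gesture("active touchscreen", "virtual bezel")))
    print(attribute_gesture(Gesture("virtual bezel", "active touchscreen")))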
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
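
For orientation only, a minimal Python sketch of the status-bar toggle of claim 8 charted above, where an operating system status bar resides in the virtual bezel region and a predefined gesture toggles its visibility to enter or leave full-screen mode. The "two-finger swipe" gesture name is an assumption; the claim recites only a predefined set of gestures.

class BezelStatusBar:
    def __init__(self):
        self.visible = True          # status bar resides in the virtual bezel region

    def on_bezel_gesture(self, gesture: str) -> None:
        if gesture == "two-finger-swipe":            # predefined full-screen toggle gesture
            self.visible = not self.visible          # hidden -> full-screen mode, and back

if __name__ == "__main__":
    bar = BezelStatusBar()
    bar.on_bezel_gesture("two-finger-swipe")
    print(bar.visible)   # False (full-screen mode)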
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set (second set, display system) of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081271A1
CLAIM 10
. A non-transitory computer-readable medium having machine instructions stored thereon , the instructions comprising : a first set of the instructions configured to determine that a multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
a second set of the instructions configured to determine whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
and a third set (third set) of the instructions configured to change a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the third set of the instructions are configured to modify a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081271A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
determining whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
changing a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the step of changing includes a step of modifying a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time (holding pattern) , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081271A1
CLAIM 1
. A method of controlling data displayed by a multi-screen device , comprising : determining that the multi-screen device has moved from a first physical orientation to a different second physical orientation , wherein the first and second physical orientations differ by the first physical orientation being one of : folded or in a landscape orientation ;
determining whether a first screen and a second screen of the multi-screen device are each displaying content for a same application ;
changing a display of the application to conform to a predetermined display configuration of the of the application on at least one of the first and second screens , wherein the display configuration is dependent upon the second physical orientation and a result of the step of determining ;
wherein the step of changing includes a step of modifying a display of the application : (a) from being displayed on only the first screen to being displayed on both the first and second screens at a same time (holding pattern) , or (b) from being displayed on both the first and second screens at a same time to being displayed on only the first screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081280A1

Filed: 2011-09-28     Issued: 2012-04-05

Single-screen view in response to rotation

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Rodney Wayne Schrock, Martin Gimpl, Sanjiv Sirpal, John Steven Visosky
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode (first direction) of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081280A1
CLAIM 9
. The method of claim 1 , wherein the first rotation input corresponds to a rotation of the device by about 90 degrees in a first direction (first mode) and wherein the second rotation input corresponds to a rotation of the device by about 90 degrees in a second direction that is opposite the first direction .

US20120081280A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .
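
For orientation only, a minimal Python sketch of the four instruction sets recited in US20120081280A1 claim 17 above: a first rotation moves the application from one screen to both screens, and a second rotation triggers a lookup of per-application display rules that determine on which single screen the application is rendered. The rule dictionary and screen labels are assumptions for illustration.

from typing import Dict

class RotationController:
    def __init__(self, display_rules: Dict[str, str]):
        self.display_rules = display_rules     # per-application display rules, e.g. {"mail": "first"}
        self.screens_in_use = ["first"]        # first open state: application on the first screen only

    def first_rotation(self) -> None:
        # First set: rotated a first time from the first open state to a second open state;
        # the application is now displayed on both the first and second screens.
        self.screens_in_use = ["first", "second"]

    def second_rotation(self, app: str) -> None:
        # Second set: receive the second rotation input.
        # Third set: reference the display rules for the application.
        target = self.display_rules.get(app, "first")
        # Fourth set: render the application on one of the screens according to its rules.
        self.screens_in_use = [target]

if __name__ == "__main__":
    ctrl = RotationController({"mail": "second"})
    ctrl.first_rotation()
    ctrl.second_rotation("mail")
    print(ctrl.screens_in_use)   # ['second']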

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US20120081280A1
CLAIM 9
. The method of claim 1 , wherein the first rotation input corresponds to a rotation of the device by about 90 degrees in a first direction (first mode) and wherein the second rotation input corresponds to a rotation of the device by about 90 degrees in a second direction that is opposite the first direction .

US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .
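
The rotation-driven sequence recited in claim 17 of US20120081280A1 above (a first rotation expands the application onto both screens; a second rotation triggers a lookup of per-application display rules and rendering on a single screen) can be pictured as a small state machine. The sketch below is an assumption for illustration only; the OpenState enum, the DISPLAY_RULES table, and the screen names are invented.

```python
from enum import Enum, auto

class OpenState(Enum):
    FIRST = auto()   # application displayed on one screen only
    SECOND = auto()  # application displayed across both screens

# Hypothetical per-application display rules consulted after the second rotation.
DISPLAY_RULES = {"mail": "screen_1", "maps": "screen_2"}

class MultiScreenDevice:
    def __init__(self):
        self.state = OpenState.FIRST
        self.rendered_on = ["screen_1"]

    def on_rotation(self, app):
        if self.state is OpenState.FIRST:
            # First rotation input: expand the application onto both screens.
            self.state = OpenState.SECOND
            self.rendered_on = ["screen_1", "screen_2"]
        else:
            # Second rotation input: reference the display rules for the
            # application and render it on the single screen the rules select.
            self.state = OpenState.FIRST
            self.rendered_on = [DISPLAY_RULES.get(app, "screen_1")]

device = MultiScreenDevice()
device.on_rotation("maps")   # first rotation -> spans both screens
device.on_rotation("maps")   # second rotation -> rules select screen_2
print(device.rendered_on)    # ['screen_2']
```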

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set (second set, display system) of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set (third set) of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081280A1
CLAIM 9
. The method of claim 1 , wherein the first rotation input corresponds to a rotation of the device by about 90 degrees in a first direction (first mode) and wherein the second rotation input corresponds to a rotation of the device by about 90 degrees in a second direction that is opposite the first direction .

US20120081280A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .
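
Claim 14 of US9645663B2, charted above, recites two modes of response: touches in the active region are acted on directly, while touches in the virtual bezel region are selectively interpreted as intentional input affecting the active-region content. A minimal sketch of such routing follows; the TwoModeTouchRouter class, the region widths, and the travel/duration heuristic used to distinguish a deliberate gesture from a resting grip are assumptions introduced here, not thresholds disclosed in the patent.

```python
from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float
    duration_ms: float
    travel_px: float  # distance the contact moved while down

class TwoModeTouchRouter:
    """Hypothetical router applying a different response mode per region."""

    def __init__(self, screen_w, bezel_w):
        self.screen_w = screen_w
        self.bezel_w = bezel_w  # width of the bezel strips along the left/right edges

    def in_bezel(self, t):
        return t.x < self.bezel_w or t.x > self.screen_w - self.bezel_w

    def handle(self, t):
        if not self.in_bezel(t):
            # First mode: every touch in the active region is acted on directly.
            return "active-region input"
        # Second mode: only touches that look deliberate (a quick swipe rather
        # than a stationary resting grip) affect the active-region content.
        if t.travel_px > 40 and t.duration_ms < 500:
            return "bezel gesture -> affects active-region content"
        return "ignored (resting grip)"

router = TwoModeTouchRouter(screen_w=1080, bezel_w=60)
print(router.handle(Touch(x=500, y=900, duration_ms=120, travel_px=5)))   # active-region input
print(router.handle(Touch(x=20, y=900, duration_ms=200, travel_px=120)))  # bezel gesture
print(router.handle(Touch(x=20, y=900, duration_ms=3000, travel_px=2)))   # ignored
```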

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction (fourth instructions) in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081280A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .

US20120081280A1
CLAIM 17
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a sensor configured to produce an electrical signal indicative of a rotation of the user device ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
a second set of instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
a third set of instructions configured to reference display rules for the first application after receiving the second rotation input ;
and a fourth instructions (response instruction) configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .
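
Claim 16 of US9645663B2, charted above, recites detecting the holding-contact region, registering it as the virtual bezel, and storing a user-chosen response instruction as personalized behavior for that region. The sketch below illustrates one possible flow under those assumptions; the PersonalizedBezel class, the grid-cell representation of the contact region, and the example response strings are invented for illustration.

```python
class PersonalizedBezel:
    """Hypothetical flow: register the holding-contact region, store the
    user's chosen response, and dispatch bezel touches accordingly."""

    def __init__(self):
        self.bezel_region = set()          # registered touch-grid cells
        self.response_instruction = None   # personalized behavior

    def register_holding_contact(self, contact_cells):
        # Cells of the touch grid detected in contact with the holding hand.
        self.bezel_region = set(contact_cells)

    def ask_user_for_response(self, chosen):
        # Stand-in for a settings prompt; the chosen behavior is persisted.
        self.response_instruction = chosen

    def handle_touch(self, cell):
        if cell in self.bezel_region:
            # Bezel input is treated as intentional and handled according to
            # the stored response instruction.
            return self.response_instruction or "default-bezel-action"
        return "active-region-action"

bezel = PersonalizedBezel()
bezel.register_holding_contact({(0, 5), (0, 6), (0, 7)})
bezel.ask_user_for_response("scroll-content")
print(bezel.handle_touch((0, 6)))   # scroll-content
print(bezel.handle_touch((4, 6)))   # active-region-action
```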

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081280A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .
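
Claim 17 of US9645663B2, charted above, derives a personalized holding pattern from the polygonal area of unintentional touches and the frequency with which that area is accessed. A minimal sketch of such frequency-based learning follows; the HoldingPatternLearner class, the min_hits threshold, and the polygon representation are assumptions for illustration only.

```python
from collections import Counter

class HoldingPatternLearner:
    """Hypothetical learner that promotes frequently recurring grip polygons
    into a personalized holding pattern defining the virtual bezel region."""

    def __init__(self, min_hits=3):
        self.min_hits = min_hits
        self.hits = Counter()

    def record_unintentional_touch(self, polygon_vertices):
        # Register the polygonal contact area and count how often it recurs.
        self.hits[tuple(sorted(polygon_vertices))] += 1

    def personalized_bezel(self):
        # Polygons touched often enough form the personalized holding pattern.
        return [list(poly) for poly, n in self.hits.items() if n >= self.min_hits]

learner = HoldingPatternLearner()
grip = [(0, 100), (60, 100), (60, 400), (0, 400)]
for _ in range(4):
    learner.record_unintentional_touch(grip)
learner.record_unintentional_touch([(0, 900), (40, 900), (40, 950), (0, 950)])  # rare
print(learner.personalized_bezel())  # only the frequently recurring grip polygon
```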

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081280A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first rotation input , the first rotation input corresponding to an indication that the multi-screen device has been rotated a first time from a first open state to a different second open state , wherein data from a first application is displayed while the device is in the first open state on a first screen of the device and not on a second screen of the device , and wherein data from the first application is displayed on the first and second screens of the device in the second open state ;
instructions configured to receive a second rotation input , the second rotation input corresponding to an indication that the device has been rotated a second time after the first time ;
instructions configured to reference display rules for the first application after receiving the second rotation input ;
and instructions configured to render a display of the first application on one of the first and second screens according to the display rules for the first application .
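
Claim 18 of US9645663B2, charted above, forms the polygonal area from a heat signature captured by the device's thermal sensors. The sketch below shows one way a coarse thermal grid could be thresholded into a candidate bezel polygon; the 30 °C threshold, the grid resolution, and the bounding-rectangle approximation are illustrative assumptions, not values from the patent.

```python
# A minimal sketch: warm cells of a coarse thermal grid approximate the
# polygonal contact area of the holding hand.
THERMAL_THRESHOLD_C = 30.0

def warm_cells(thermal_grid):
    """Return (row, col) cells whose temperature suggests hand contact."""
    return [
        (r, c)
        for r, row in enumerate(thermal_grid)
        for c, temp in enumerate(row)
        if temp >= THERMAL_THRESHOLD_C
    ]

def bounding_polygon(cells):
    """Approximate the contact area by the vertices of its bounding rectangle."""
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    return [(min(rows), min(cols)), (min(rows), max(cols)),
            (max(rows), max(cols)), (max(rows), min(cols))]

grid = [
    [24.0, 24.5, 25.0, 24.0],
    [33.0, 32.5, 24.0, 24.0],   # warm strip along the left edge
    [34.0, 33.0, 24.5, 24.0],
    [24.0, 24.0, 24.0, 24.0],
]
cells = warm_cells(grid)
print(bounding_polygon(cells))  # polygon registered as the candidate bezel area
```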




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081289A1

Filed: 2011-09-28     Issued: 2012-04-05

Keyboard filling one screen or spanning multiple screens of a multiple screen device

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Robert Csiki
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (operating modes) of response to a first set (first one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (operating modes) of response to a second set (first one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081289A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US20120081289A1
CLAIM 11
. A device , comprising : a first screen , the first screen including a touch sensitive display area ;
a second screen , the second screen including a touch sensitive display area ;
memory ;
a processor ;
application programming stored in the memory and executed by the processor , wherein the application programming is operable to : in a first operating mode , display a virtual keyboard using a portion of the touch sensitive display area of the first screen and using a portion of the touch sensitive display area of the second screen ;
in a second operating mode , display the virtual keyboard using at least a portion of a first one (first set, second set) of the touch sensitive display area of the first screen and the second screen , and without using any portion of a second one of the touch sensitive display area of the first screen and the second screen .
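
Claims 1 and 11 of US20120081289A1, recited above, distinguish a spanning keyboard mode (portions of the keyboard on both touch screens) from a single-screen mode (the whole keyboard on one screen). A minimal sketch of that mode selection follows; the layout_keyboard function, the mode names, and the key split are assumptions introduced here.

```python
def layout_keyboard(mode, keys):
    """Return which screen presents which keys in each operating mode."""
    if mode == "spanning":
        # First operating mode: one portion of the keyboard on each touch screen.
        half = len(keys) // 2
        return {"screen_1": keys[:half], "screen_2": keys[half:]}
    if mode == "single":
        # Second operating mode: the whole keyboard on one touch screen only.
        return {"screen_1": keys, "screen_2": []}
    raise ValueError(f"unknown operating mode: {mode}")

keys = list("QWERTYUIOP")
print(layout_keyboard("spanning", keys))
print(layout_keyboard("single", keys))
```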

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (operating modes) of response in the active touchscreen region .
US20120081289A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (operating modes) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (operating modes) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081289A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (operating modes) of response in the virtual bezel region .
US20120081289A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes (first mode, second mode, screen mode) ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081289A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081289A1
CLAIM 1
. A method , comprising : providing a device having first and second touch screens ;
selecting from first and second operating modes ;
in the first operating mode , first presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first touch screen , and wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second touch screen ;
in the second operating mode , second presenting a virtual keyboard , wherein all of the virtual keyboard is presented using one of the first touch screen and the second touch screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081292A1

Filed: 2011-09-28     Issued: 2012-04-05

Desktop reveal

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, John Steven Visosky, Alexander de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (different application) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion, information storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081292A1
CLAIM 6
. The smartpad of claim 5 , wherein the at least two portions are different sizes and relate to two different applications (first set) .

US20120081292A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US20120081292A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion (first portion) for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .
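
Claim 20 of US20120081292A1, recited above, splits the display into two logical portions on entry into multi-application mode. The sketch below illustrates one such split under invented assumptions (the LogicalPortion dataclass, an equal side-by-side division, and the example window names).

```python
from dataclasses import dataclass

@dataclass
class LogicalPortion:
    window: str
    x: int
    y: int
    w: int
    h: int

def enter_multi_application_mode(screen_w, screen_h, first_window, second_window):
    """Split one display into two side-by-side logical portions."""
    half = screen_w // 2
    return (
        LogicalPortion(first_window, 0, 0, half, screen_h),                  # first portion
        LogicalPortion(second_window, half, 0, screen_w - half, screen_h),   # second portion
    )

left, right = enter_multi_application_mode(2560, 1600, "browser", "notes")
print(left)
print(right)
```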

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081292A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081292A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion, information storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081292A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US20120081292A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion (first portion) for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081292A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081292A1
CLAIM 20
. The method of claim 8 , further comprising receiving a request to enter multi-application mode , and , in response thereto , splitting the display into at least two logical portions , a first portion for displaying the first window and a second portion (second portion, usage frequency) for displaying the second window .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081293A1

Filed: 2011-09-28     Issued: 2012-04-05

Gravity drop rules and keyboard display on a multiple screen device

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (touch screens) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (user selection) of response to a second set (first one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen, user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081293A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
positioning the device in a first device orientation ;
presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first screen , wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second screen , wherein first information is presented using a second portion of the first screen , and wherein second information is presented using a second portion of the second screen ;
receiving input from a user , the input including rotating the device to a second device orientation ;
determining that a focus of the user is on the first information presented by the first screen ;
determining a direction of the rotation of the device ;
the method further including at least one of : in response to determining that the direction of rotation of the device is towards the first screen : discontinuing presenting the virtual keyboard ;
discontinuing presenting the first information ;
expanding the presentation of the second information , wherein after expanding the presentation of the second information the first and second screens at least partially display at least one of the second information or information related to the second information ;
in response to determining that the direction of rotation of the device is away from the first screen : continuing to present the virtual keyboard ;
discontinuing presenting the second information ;
continuing to present the first information .

US20120081293A1
CLAIM 2
. The method of claim 1 , wherein in the first orientation the first and second screens are in a portrait mode , and wherein in the second operating mode the first and second touch screens (display screen) are in a landscape mode .

US20120081293A1
CLAIM 10
. The method of claim 1 , wherein in the first device orientation the presentation of the virtual keyboard is modified as compared to the presentation of the virtual keyboard in the second device orientation , and wherein the modification includes aligning keys included in the virtual keyboard to accommodate a gap between a first group of the keys included in the virtual keyboard that are presented by the first touch screen (user input) and a second group of keys included in the virtual keyboard that are presented by the second touch screen .

US20120081293A1
CLAIM 12
. The device of claim 11 , wherein the direction of rotation is not towards the first one (first set, second set) of the first and second screens identified as having the focus of the user .

US20120081293A1
CLAIM 13
. The device of claim 12 , further comprising : in the second device orientation , displaying the virtual keyboard using the second screen ;
receiving a user input (user input) to discontinue the display of the virtual keyboard ;
in response to the user input to discontinue the display of the virtual keyboard , displaying using the second screen information related to information displayed by the first screen .

US20120081293A1
CLAIM 19
. The computer readable medium of claim 17 , further comprising : instructions to determine a focus of the user from a user selection (second mode, screen mode) of one of the first and second information .
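
Claim 1 of US20120081293A1, recited above, branches on whether the device is rotated toward or away from the screen holding the user's focus, keeping or discontinuing the virtual keyboard and the first or second information accordingly. A minimal sketch of that branch logic follows; the on_rotation function, the state dictionary keys, and the screen names are assumptions for illustration only.

```python
def on_rotation(focus_screen, rotation_toward, state):
    """Apply the claim-1 branch logic for a rotation of the two-screen device."""
    new_state = dict(state)
    if rotation_toward == focus_screen:
        # Rotation toward the focused screen: drop the keyboard and the focused
        # (first) information, and expand the second information across screens.
        new_state.update(keyboard=False, first_info=False, second_info="expanded")
    else:
        # Rotation away from the focused screen: keep the keyboard and the first
        # information, and discontinue the second information.
        new_state.update(keyboard=True, first_info=True, second_info=None)
    return new_state

state = {"keyboard": True, "first_info": True, "second_info": "shown"}
print(on_rotation("screen_1", "screen_1", state))   # rotation toward the focus
print(on_rotation("screen_1", "screen_2", state))   # rotation away from the focus
```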

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch screens) .
US20120081293A1
CLAIM 2
. The method of claim 1 , wherein in the first orientation the first and second screens are in a portrait mode , and wherein in the second operating mode the first and second touch screens (display screen) are in a landscape mode .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch screens) .
US20120081293A1
CLAIM 2
. The method of claim 1 , wherein in the first orientation the first and second screens are in a portrait mode , and wherein in the second operating mode the first and second touch screens (display screen) are in a landscape mode .
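
Claims 5 and 6 of US9645663B2, charted above, attribute a multi-touch input that originates simultaneously in both regions entirely to the virtual bezel region (claim 5) or entirely to the active touchscreen region (claim 6). The sketch below treats that attribution as a configurable policy; the classify_multitouch function, the bezel width, and the policy flag are invented for illustration. The design point is simply that a straddling gesture is never split between the two regions.

```python
def classify_multitouch(points, bezel_width, screen_width, policy="bezel"):
    """Attribute a simultaneous multi-touch that straddles both regions."""
    in_bezel = [x < bezel_width or x > screen_width - bezel_width for x, _ in points]
    if all(in_bezel):
        return "virtual bezel region"
    if not any(in_bezel):
        return "active touchscreen region"
    # Straddling input: claim 5 attributes it to the bezel, claim 6 to the
    # active region; here the choice is a configurable policy.
    return "virtual bezel region" if policy == "bezel" else "active touchscreen region"

touches = [(10.0, 500.0), (400.0, 520.0)]   # one contact in each region
print(classify_multitouch(touches, bezel_width=60, screen_width=1080, policy="bezel"))
print(classify_multitouch(touches, bezel_width=60, screen_width=1080, policy="active"))
```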

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch screens) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081293A1
CLAIM 2
. The method of claim 1 , wherein in the first orientation the first and second screens are in a portrait mode , and wherein in the second operating mode the first and second touch screens (display screen) are in a landscape mode .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (touch screens) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (user selection) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen, user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081293A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
positioning the device in a first device orientation ;
presenting a virtual keyboard , wherein a first portion (first portion) of the virtual keyboard is presented using a first portion of the first screen , wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second screen , wherein first information is presented using a second portion of the first screen , and wherein second information is presented using a second portion of the second screen ;
receiving input from a user , the input including rotating the device to a second device orientation ;
determining that a focus of the user is on the first information presented by the first screen ;
determining a direction of the rotation of the device ;
the method further including at least one of : in response to determining that the direction of rotation of the device is towards the first screen : discontinuing presenting the virtual keyboard ;
discontinuing presenting the first information ;
expanding the presentation of the second information , wherein after expanding the presentation of the second information the first and second screens at least partially display at least one of the second information or information related to the second information ;
in response to determining that the direction of rotation of the device is away from the first screen : continuing to present the virtual keyboard ;
discontinuing presenting the second information ;
continuing to present the first information .

US20120081293A1
CLAIM 2
. The method of claim 1 , wherein in the first orientation the first and second screens are in a portrait mode , and wherein in the second operating mode the first and second touch screens (display screen) are in a landscape mode .

US20120081293A1
CLAIM 10
. The method of claim 1 , wherein in the first device orientation the presentation of the virtual keyboard is modified as compared to the presentation of the virtual keyboard in the second device orientation , and wherein the modification includes aligning keys included in the virtual keyboard to accommodate a gap between a first group of the keys included in the virtual keyboard that are presented by the first touch screen (user input) and a second group of keys included in the virtual keyboard that are presented by the second touch screen .

US20120081293A1
CLAIM 13
. The device of claim 12 , further comprising : in the second device orientation , displaying the virtual keyboard using the second screen ;
receiving a user input (user input) to discontinue the display of the virtual keyboard ;
in response to the user input to discontinue the display of the virtual keyboard , displaying using the second screen information related to information displayed by the first screen .

US20120081293A1
CLAIM 19
. The computer readable medium of claim 17 , further comprising : instructions to determine a focus of the user from a user selection (second mode, screen mode) of one of the first and second information .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch screens) , the gestural software application configured to produce the second mode (user selection) of response in the virtual bezel region .
US20120081293A1
CLAIM 2
. The method of claim 1 , wherein in the first orientation the first and second screens are in a portrait mode , and wherein in the second operating mode the first and second touch screens (display screen) are in a landscape mode .

US20120081293A1
CLAIM 19
. The computer readable medium of claim 17 , further comprising : instructions to determine a focus of the user from a user selection (second mode, screen mode) of one of the first and second information .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (first touch screen, user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081293A1
CLAIM 10
. The method of claim 1 , wherein in the first device orientation the presentation of the virtual keyboard is modified as compared to the presentation of the virtual keyboard in the second device orientation , and wherein the modification includes aligning keys included in the virtual keyboard to accommodate a gap between a first group of the keys included in the virtual keyboard that are presented by the first touch screen (user input) and a second group of keys included in the virtual keyboard that are presented by the second touch screen .

US20120081293A1
CLAIM 13
. The device of claim 12 , further comprising : in the second device orientation , displaying the virtual keyboard using the second screen ;
receiving a user input (user input) to discontinue the display of the virtual keyboard ;
in response to the user input to discontinue the display of the virtual keyboard , displaying using the second screen information related to information displayed by the first screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (first touch screen, user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081293A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
positioning the device in a first device orientation ;
presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first screen , wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second screen , wherein first information is presented using a second portion of the first screen , and wherein second information is presented using a second portion of the second screen ;
receiving input from a user , the input including rotating the device to a second device orientation ;
determining that a focus of the user is on the first information presented by the first screen ;
determining a direction of the rotation of the device ;
the method further including at least one of : in response to determining that the direction of rotation of the device is towards the first screen : discontinuing presenting the virtual keyboard ;
discontinuing presenting the first information ;
expanding the presentation of the second information , wherein after expanding the presentation of the second information the first and second screens at least partially display at least one of the second information or information related to the second information ;
in response to determining that the direction of rotation of the device is away from the first screen : continuing to present the virtual keyboard ;
discontinuing presenting the second information ;
continuing to present the first information .

US20120081293A1
CLAIM 10
. The method of claim 1 , wherein in the first device orientation the presentation of the virtual keyboard is modified as compared to the presentation of the virtual keyboard in the second device orientation , and wherein the modification includes aligning keys included in the virtual keyboard to accommodate a gap between a first group of the keys included in the virtual keyboard that are presented by the first touch screen (user input) and a second group of keys included in the virtual keyboard that are presented by the second touch screen .

US20120081293A1
CLAIM 13
. The device of claim 12 , further comprising : in the second device orientation , displaying the virtual keyboard using the second screen ;
receiving a user input (user input) to discontinue the display of the virtual keyboard ;
in response to the user input to discontinue the display of the virtual keyboard , displaying using the second screen information related to information displayed by the first screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (first touch screen, user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081293A1
CLAIM 1
. A method , comprising : providing a device having first and second screens ;
positioning the device in a first device orientation ;
presenting a virtual keyboard , wherein a first portion of the virtual keyboard is presented using a first portion of the first screen , wherein a second portion (second portion, usage frequency) of the virtual keyboard is presented using a first portion of the second screen , wherein first information is presented using a second portion of the first screen , and wherein second information is presented using a second portion of the second screen ;
receiving input from a user , the input including rotating the device to a second device orientation ;
determining that a focus of the user is on the first information presented by the first screen ;
determining a direction of the rotation of the device ;
the method further including at least one of : in response to determining that the direction of rotation of the device is towards the first screen : discontinuing presenting the virtual keyboard ;
discontinuing presenting the first information ;
expanding the presentation of the second information , wherein after expanding the presentation of the second information the first and second screens at least partially display at least one of the second information or information related to the second information ;
in response to determining that the direction of rotation of the device is away from the first screen : continuing to present the virtual keyboard ;
discontinuing presenting the second information ;
continuing to present the first information .

US20120081293A1
CLAIM 10
. The method of claim 1 , wherein in the first device orientation the presentation of the virtual keyboard is modified as compared to the presentation of the virtual keyboard in the second device orientation , and wherein the modification includes aligning keys included in the virtual keyboard to accommodate a gap between a first group of the keys included in the virtual keyboard that are presented by the first touch screen (user input) and a second group of keys included in the virtual keyboard that are presented by the second touch screen .

US20120081293A1
CLAIM 13
. The device of claim 12 , further comprising : in the second device orientation , displaying the virtual keyboard using the second screen ;
receiving a user input (user input) to discontinue the display of the virtual keyboard ;
in response to the user input to discontinue the display of the virtual keyboard , displaying using the second screen information related to information displayed by the first screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081311A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad orientation

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (information storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081311A1
CLAIM 20
. One or more of one or more means for performing the steps of claim 12 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 12 .
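
For orientation only, the sketch below illustrates one way the two modes of response recited in claim 1 could be modeled: touches landing in an edge band are attributed to a virtual bezel region and handled under a second, more selective mode, while all other touches receive ordinary first-mode handling. This is a minimal illustration, not the patentee's or any charted reference's implementation; every name (TouchPoint, classify, respond) and the fixed bezel width are assumptions.

```kotlin
// Hypothetical sketch of claim-1-style dual-mode touch handling; not taken
// from US9645663B2 or any cited reference.
data class TouchPoint(val x: Float, val y: Float)

enum class Region { ACTIVE, VIRTUAL_BEZEL }

// Classify a touch against a screen whose outer band of bezelWidthPx pixels
// along every edge is treated as the virtual bezel region.
fun classify(p: TouchPoint, screenW: Float, screenH: Float, bezelWidthPx: Float): Region =
    if (p.x < bezelWidthPx || p.y < bezelWidthPx ||
        p.x > screenW - bezelWidthPx || p.y > screenH - bezelWidthPx)
        Region.VIRTUAL_BEZEL else Region.ACTIVE

// First mode of response: ordinary content interaction.
// Second mode of response: selectively interpret bezel touches, forwarding
// only those deemed intentional to affect the active-region content.
fun respond(p: TouchPoint, region: Region, looksIntentional: Boolean): String = when (region) {
    Region.ACTIVE -> "first-mode: deliver to content at (${p.x}, ${p.y})"
    Region.VIRTUAL_BEZEL ->
        if (looksIntentional) "second-mode: treat as intentional input affecting active content"
        else "second-mode: ignore as incidental grip contact"
}

fun main() {
    val edgeTouch = TouchPoint(5f, 400f)
    val region = classify(edgeTouch, screenW = 1080f, screenH = 1920f, bezelWidthPx = 48f)
    println(respond(edgeTouch, region, looksIntentional = false))
}
```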

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081311A1
CLAIM 3
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081311A1
CLAIM 3
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .
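
As a hedged illustration of the visible/hidden toggling recited in claims 12 and 13, and of navigation by a further set of inputs while the status panel and soft buttons are hidden, the following sketch uses invented names (StatusPanel, navigate) and a plain gesture string; it is not drawn from either patent.

```kotlin
// Hypothetical sketch of a claim-12/13-style status panel toggle; names are illustrative.
class StatusPanel(var visible: Boolean = true) {
    fun toggle() { visible = !visible }
}

// When both the status panel and soft buttons are hidden, a third set of
// touch-based inputs (here, a named gesture string) can still drive navigation.
fun navigate(gesture: String, panel: StatusPanel, softButtonsHidden: Boolean): String =
    if (!panel.visible && softButtonsHidden) "navigate via gesture: $gesture"
    else "use visible controls"

fun main() {
    val panel = StatusPanel()
    panel.toggle()                                   // hide the status panel
    println(navigate("edge-swipe", panel, softButtonsHidden = true))
}
```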

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081311A1
CLAIM 20
. One or more of one or more means for performing the steps of claim 12 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 12 .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081311A1
CLAIM 8
. The smartpad of claim 1 , wherein the smartpad has a single application mode , the single application mode displaying an application in full screen (touchscreen display) .
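
The sketch below is a minimal, assumption-laden illustration of the claim 16 flow: detect the region a holding hand covers, register it as the virtual bezel region, and store the user's chosen response behavior for that region. Rect, BezelBehavior, BezelMemory and the example coordinates are all hypothetical.

```kotlin
// Hypothetical sketch of claim-16-style registration of a virtual bezel region
// and a personalized response behavior; not an implementation from the patents.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

enum class BezelBehavior { IGNORE, SCROLL_CONTENT, SHOW_SHORTCUTS }

class BezelMemory {
    var bezelRegion: Rect? = null
        private set
    var behavior: BezelBehavior = BezelBehavior.IGNORE
        private set

    // "registering the detected region as the virtual bezel region in a memory
    // of the electronic device", together with the user's response instruction.
    fun register(region: Rect, chosenBehavior: BezelBehavior) {
        bezelRegion = region
        behavior = chosenBehavior
    }
}

fun main() {
    val memory = BezelMemory()
    // Region detected as being in contact with the holding hand (illustrative values).
    val detected = Rect(left = 0f, top = 600f, right = 60f, bottom = 900f)
    // Stand-in for "offering the user to instruct the system what type of response to execute".
    memory.register(detected, BezelBehavior.SCROLL_CONTENT)
    println("bezel=${memory.bezelRegion}, behavior=${memory.behavior}")
}
```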

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081311A1
CLAIM 8
. The smartpad of claim 1 , wherein the smartpad has a single application mode , the single application mode displaying an application in full screen (touchscreen display) .
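
As a rough sketch of the claim 17 steps (recording the polygon an unintentional grip touches, counting how often it is touched, and promoting it to a personalized holding pattern once that frequency is high enough), the code below invents HoldingPatternStore, a threshold of 3, and the grip polygon; none of it comes from the charted documents.

```kotlin
// Hypothetical sketch of claim-17-style frequency-based holding-pattern registration.
data class Point(val x: Float, val y: Float)

class HoldingPatternStore(private val frequencyThreshold: Int = 10) {
    private val touchCounts = mutableMapOf<List<Point>, Int>()
    var registeredBezel: List<Point>? = null
        private set

    // Record the polygonal area of an unintentional touch and its access frequency;
    // once the frequency threshold is crossed, register it as the holding pattern
    // that defines the virtual bezel region.
    fun recordUnintentionalTouch(vertices: List<Point>) {
        val count = (touchCounts[vertices] ?: 0) + 1
        touchCounts[vertices] = count
        if (count >= frequencyThreshold && registeredBezel == null) {
            registeredBezel = vertices
        }
    }
}

fun main() {
    val store = HoldingPatternStore(frequencyThreshold = 3)
    val gripPolygon = listOf(Point(0f, 500f), Point(40f, 500f), Point(40f, 800f), Point(0f, 800f))
    repeat(3) { store.recordUnintentionalTouch(gripPolygon) }
    println("registered holding pattern: ${store.registeredBezel}")
}
```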

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081311A1
CLAIM 8
. The smartpad of claim 1 , wherein the smartpad has a single application mode , the single application mode displaying an application in full screen (touchscreen display) .
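
The claim 18 variant replaces touch detection with a heat signature from the device's thermal sensors. The sketch below assumes a small grid of temperature readings, a freely chosen 31 °C threshold, and an axis-aligned bounding polygon; it is illustrative only and not drawn from any charted document.

```kotlin
// Hypothetical sketch of claim-18-style heat-signature detection and
// polygon registration; sensor layout, threshold and names are assumptions.
data class Cell(val row: Int, val col: Int)

// Cells whose reading exceeds the threshold form the heat signature left by the hand.
fun heatSignature(readingsCelsius: List<List<Double>>, thresholdCelsius: Double = 31.0): List<Cell> =
    readingsCelsius.flatMapIndexed { r, row ->
        row.mapIndexedNotNull { c, t -> if (t > thresholdCelsius) Cell(r, c) else null }
    }

// Axis-aligned bounding polygon of the signature, listed as its four corner cells.
fun boundingPolygon(cells: List<Cell>): List<Cell> {
    val top = cells.minOf { it.row }; val bottom = cells.maxOf { it.row }
    val left = cells.minOf { it.col }; val right = cells.maxOf { it.col }
    return listOf(Cell(top, left), Cell(top, right), Cell(bottom, right), Cell(bottom, left))
}

fun main() {
    val readings = listOf(
        listOf(24.0, 24.5, 25.0),
        listOf(33.0, 34.0, 24.0),   // warm band where the gripping hand rests
        listOf(33.5, 32.0, 24.0)
    )
    val polygon = boundingPolygon(heatSignature(readings))
    println("heat-signature polygon to register as the virtual bezel: $polygon")
}
```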




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081312A1

Filed: 2011-09-28     Issued: 2012-04-05

Smartpad split screen

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Alexander de Paz, Martin Gimpl, John Steven Visosky
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081312A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081312A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081312A1
CLAIM 2
. The smartpad of claim 1 , wherein the display is a touch screen (electronic device status display panel) display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (n storage) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081312A1
CLAIM 19
. One or more of one or more means for performing the steps of claim 8 and a non-transitory computer-readable information storage (first portion) media having stored thereon instructions , that when executed by a processor , perform the steps of claim 8 .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081312A1
CLAIM 1
. A smartpad comprising : a screen ;
and a display , the display configured to display content from a docked multi-screen device , wherein a dual screen application on the docked multi-screen device is displayed in full screen (touchscreen display) on the display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081312A1
CLAIM 1
. A smartpad comprising : a screen ;
and a display , the display configured to display content from a docked multi-screen device , wherein a dual screen application on the docked multi-screen device is displayed in full screen (touchscreen display) on the display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081312A1
CLAIM 1
. A smartpad comprising : a screen ;
and a display , the display configured to display content from a docked multi-screen device , wherein a dual screen application on the docked multi-screen device is displayed in full screen (touchscreen display) on the display .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044545A2

Filed: 2011-09-23     Issued: 2012-04-05

Gesture controlled screen repositioning for one or more displays

(Original Assignee) Imerj, Llc     

Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044545A2
CLAIM 2
. The method as recited in Claim 1 , wherein the first gesture input is a drag gesture in a first direction (first mode) .

WO2012044545A2
CLAIM 12
. The method as recited in Claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

WO2012044545A2
CLAIM 14
. The method as recited in Claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion (first portion) of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
WO2012044545A2
CLAIM 2
. The method as recited in Claim 1 , wherein the first gesture input is a drag gesture in a first direction (first mode) .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (single display) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2012044545A2
CLAIM 14
. The method as recited in Claim 1 , wherein the plurality of displays comprise separate portions of a single display (operating system status bar) , wherein the first display corresponds with a first portion of the single display and the second display corresponds with a second portion of the single display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen) intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044545A2
CLAIM 2
. The method as recited in Claim 1 , wherein the first gesture input is a drag gesture in a first direction (first mode) .

WO2012044545A2
CLAIM 12
. The method as recited in Claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

WO2012044545A2
CLAIM 14
. The method as recited in Claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion (first portion) of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (first touch screen) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044545A2
CLAIM 12
. The method as recited in Claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (first touch screen) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044545A2
CLAIM 12
. The method as recited in Claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

WO2012044545A2
CLAIM 14
. The method as recited in Claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (first touch screen) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044545A2
CLAIM 12
. The method as recited in Claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

WO2012044545A2
CLAIM 14
. The method as recited in Claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044516A2

Filed: 2011-09-22     Issued: 2012-04-05

Multi-screen user interface with orientation based control

(Original Assignee) Imerj, Llc     

Alex De Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set (different one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044516A2
CLAIM 1
. A method of controlling a plurality of displays of a handheld computing device , comprising : disposing the handheld computing device in a first orientation ;
displaying a first screen of a first application on a first display of the plurality of displays when the handheld computing device is in the first orientation ;
positioning the handheld computing device in a second orientation different than the first orientation by moving the handheld computing device from the first orientation in a first direction (first mode) ;
and modifying the plurality of displays such that the first application is displayed on the first display and a second display of the plurality of displays in response to the positioning .

WO2012044516A2
CLAIM 12
. The method according to Claim 7 , further comprising : receiving a gesture input at the handheld computing device when the handheld computing device is in one of the second orientation and the third orientation ;
and altering one of the first display and second display to display a different one (first set) of the first application and the second application in response to the receiving .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
WO2012044516A2
CLAIM 1
. A method of controlling a plurality of displays of a handheld computing device , comprising : disposing the handheld computing device in a first orientation ;
displaying a first screen of a first application on a first display of the plurality of displays when the handheld computing device is in the first orientation ;
positioning the handheld computing device in a second orientation different than the first orientation by moving the handheld computing device from the first orientation in a first direction (first mode) ;
and modifying the plurality of displays such that the first application is displayed on the first display and a second display of the plurality of displays in response to the positioning .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044516A2
CLAIM 1
. A method of controlling a plurality of displays of a handheld computing device , comprising : disposing the handheld computing device in a first orientation ;
displaying a first screen of a first application on a first display of the plurality of displays when the handheld computing device is in the first orientation ;
positioning the handheld computing device in a second orientation different than the first orientation by moving the handheld computing device from the first orientation in a first direction (first mode) ;
and modifying the plurality of displays such that the first application is displayed on the first display and a second display of the plurality of displays in response to the positioning .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044515A2

Filed: 2011-09-22     Issued: 2012-04-05

Gesture based application management

(Original Assignee) Imerj, Llc     

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044515A2
CLAIM 3
. The method according to Claim 2 , wherein the application manager includes icons representing the one or more applications associated with the first display in a first portion (first portion) of the application manager and icons representing applications associated with the second display in a second portion (second portion, usage frequency) of the application manager .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044515A2
CLAIM 3
. The method according to Claim 2 , wherein the application manager includes icons representing the one or more applications associated with the first display in a first portion (first portion) of the application manager and icons representing applications associated with the second display in a second portion (second portion, usage frequency) of the application manager .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044515A2
CLAIM 3
. The method according to Claim 2 , wherein the application manager includes icons representing the one or more applications associated with the first display in a first portion of the application manager and icons representing applications associated with the second display in a second portion (second portion, usage frequency) of the application manager .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044515A2
CLAIM 3
. The method according to Claim 2 , wherein the application manager includes icons representing the one or more applications associated with the first display in a first portion of the application manager and icons representing applications associated with the second display in a second portion (second portion, usage frequency) of the application manager .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012044510A2

Filed: 2011-09-21     Issued: 2012-04-05

User interface with independent drawer control

(Original Assignee) Imerj, Llc     

Paul E. Reeves
US9645663B2
CLAIM 1
. A display system (first location) for an electronic device comprising : a touch-sensitive display screen (status bar) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 2
. The display system (first location) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 3
. The display system (first location) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 4
. The display system (first location) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 5
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (status bar) .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 6
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (status bar) .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 7
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .
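
Claims 3 through 7 recite routing rules for inputs that straddle the two regions: a drag belongs to the region where it originated, and a multi-touch that begins simultaneously in both regions is resolved in favor of the bezel (claim 5), the active region (claim 6), or per a user instruction (claim 7). The sketch below models those rules with invented enum and function names and is not an implementation from any charted document.

```kotlin
// Hypothetical sketch of the cross-region routing behavior recited in claims 3-7.
enum class Region { ACTIVE, VIRTUAL_BEZEL }
enum class MultiTouchPolicy { FAVOR_BEZEL, FAVOR_ACTIVE, ASK_USER_SETTING }

// Claims 3 and 4: a touch that starts in one region and ends in the other is
// processed as belonging to the region of origin (termination is ignored).
fun routeDrag(origin: Region, termination: Region): Region = origin

// Claims 5-7: a multi-touch originating simultaneously in both regions is
// resolved according to the configured policy (user instruction in claim 7).
fun routeSimultaneousMultiTouch(policy: MultiTouchPolicy, userPrefersBezel: Boolean): Region =
    when (policy) {
        MultiTouchPolicy.FAVOR_BEZEL -> Region.VIRTUAL_BEZEL
        MultiTouchPolicy.FAVOR_ACTIVE -> Region.ACTIVE
        MultiTouchPolicy.ASK_USER_SETTING ->
            if (userPrefersBezel) Region.VIRTUAL_BEZEL else Region.ACTIVE
    }

fun main() {
    println(routeDrag(origin = Region.ACTIVE, termination = Region.VIRTUAL_BEZEL))   // ACTIVE
    println(routeSimultaneousMultiTouch(MultiTouchPolicy.ASK_USER_SETTING, userPrefersBezel = true))
}
```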

US9645663B2
CLAIM 8
. The display system (first location) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 9
. The display system (first location) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 10
. The display system (first location) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 11
. The display system (first location) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .
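
Claims 8 through 11 place an operating-system status bar and a set of soft buttons inside the virtual bezel region, with gesture-toggled full-screen mode and user-controlled repositioning, hiding, and addition of buttons. The following sketch captures that behavior with hypothetical names (VirtualBezelChrome, SoftButton) and an arbitrary toggle gesture string; it is illustrative only.

```kotlin
// Hypothetical sketch of the claim-8-to-11 bezel chrome: status bar toggle plus
// repositionable, hideable, addable soft buttons. All names are assumptions.
data class SoftButton(val id: String, var xInBezel: Float, var visible: Boolean = true)

class VirtualBezelChrome(private val toggleGesture: String = "two-finger-swipe-down") {
    var statusBarVisible = true
        private set
    val softButtons = mutableListOf(SoftButton("back", 10f), SoftButton("home", 60f))

    fun onGesture(gesture: String) {
        if (gesture == toggleGesture) statusBarVisible = !statusBarVisible   // claim 8
    }
    fun reposition(id: String, newX: Float) =
        softButtons.find { it.id == id }?.let { it.xInBezel = newX }         // claim 9
    fun toggleButton(id: String) =
        softButtons.find { it.id == id }?.let { it.visible = !it.visible }   // claim 10
    fun addButton(id: String, x: Float) = softButtons.add(SoftButton(id, x)) // claim 11
}

fun main() {
    val chrome = VirtualBezelChrome()
    chrome.onGesture("two-finger-swipe-down")     // enter full-screen: hide status bar
    chrome.reposition("home", 120f)
    chrome.toggleButton("back")
    chrome.addButton("screenshot", 200f)
    println("statusBarVisible=${chrome.statusBarVisible}, buttons=${chrome.softButtons}")
}
```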

US9645663B2
CLAIM 12
. The display system (first location) according to claim 9 , wherein the display screen (status bar) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2012044510A2
CLAIM 3
. The method according to Claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (status bar) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (status bar) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (status bar) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (status bar) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (status bar) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012044510A2
CLAIM 9
. The method according to Claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081309A1

Filed: 2011-09-01     Issued: 2012-04-05

Displayed image transition indicator

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Alexander de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to a second set (following steps) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081309A1
CLAIM 5
. The method of claim 1 , wherein the transition indicator is unable to receive or provide dynamic user input (user input) or output , respectively , wherein the transition indicator has an appearance different from the displayed image , and wherein the transition indicator covers only part of the second touch-sensitive display before the displayed image is moved to cover fully the second touch-sensitive display .

US20120081309A1
CLAIM 8
. A non-transient computer readable medium comprising microprocessor executable instructions operable to perform at least the following steps (second mode, second set, second portion) : receiving , by at least one of a gesture capture region and a touch sensitive display , a gesture , the gesture indicating that a displayed image is to be moved from a first touch sensitive display to a second touch sensitive display ;
and in response and prior to movement of the displayed image to the second touch sensitive display , moving a transition indicator from the first touch sensitive display to the second touch sensitive display to a selected position to be occupied by the displayed image ;
and thereafter moving the displayed image from the first touch sensitive display to the second touch sensitive display to the selected position .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081309A1
CLAIM 5
. The method of claim 1 , wherein the transition indicator is unable to receive or provide dynamic user input (user input) or output , respectively , wherein the transition indicator has an appearance different from the displayed image , and wherein the transition indicator covers only part of the second touch-sensitive display before the displayed image is moved to cover fully the second touch-sensitive display .

US20120081309A1
CLAIM 8
. A non-transient computer readable medium comprising microprocessor executable instructions operable to perform at least the following steps (second mode, second set, second portion) : receiving , by at least one of a gesture capture region and a touch sensitive display , a gesture , the gesture indicating that a displayed image is to be moved from a first touch sensitive display to a second touch sensitive display ;
and in response and prior to movement of the displayed image to the second touch sensitive display , moving a transition indicator from the first touch sensitive display to the second touch sensitive display to a selected position to be occupied by the displayed image ;
and thereafter moving the displayed image from the first touch sensitive display to the second touch sensitive display to the selected position .
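
As a reading aid for the cited claim 8 of US20120081309A1, the transition-indicator sequence (the indicator first occupies the target position, then the displayed image follows) can be sketched as below. This is a hedged paraphrase in Python; the gesture label, display names, and indicator naming are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Display:
    name: str
    items: list = field(default_factory=list)

def move_with_transition(gesture: str, src: Display, dst: Display, image: str) -> None:
    """Move `image` from src to dst, previewing the target position first."""
    if gesture != "move-to-other-display":          # hypothetical gesture label
        return
    indicator = f"transition-indicator-for-{image}"
    dst.items.append(indicator)                     # step 1: indicator occupies the selected position
    dst.items[dst.items.index(indicator)] = image   # step 2: displayed image replaces the indicator
    src.items.remove(image)

if __name__ == "__main__":
    d1, d2 = Display("primary", ["photo.png"]), Display("secondary")
    move_with_transition("move-to-other-display", d1, d2, "photo.png")
    print(d1.items, d2.items)   # [] ['photo.png']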

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (following steps) of response in the virtual bezel region .
US20120081309A1
CLAIM 8
. A non-transient computer readable medium comprising microprocessor executable instructions operable to perform at least the following steps (second mode, second set, second portion) : receiving , by at least one of a gesture capture region and a touch sensitive display , a gesture , the gesture indicating that a displayed image is to be moved from a first touch sensitive display to a second touch sensitive display ;
and in response and prior to movement of the displayed image to the second touch sensitive display , moving a transition indicator from the first touch sensitive display to the second touch sensitive display to a selected position to be occupied by the displayed image ;
and thereafter moving the displayed image from the first touch sensitive display to the second touch sensitive display to the selected position .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081309A1
CLAIM 5
. The method of claim 1 , wherein the transition indicator is unable to receive or provide dynamic user input (user input) or output , respectively , wherein the transition indicator has an appearance different from the displayed image , and wherein the transition indicator covers only part of the second touch-sensitive display before the displayed image is moved to cover fully the second touch-sensitive display .
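
A compact way to follow the claim 16 method steps is the sketch below: detect grip contact points, register their bounding area as the virtual bezel, route later touches in that area as intentional input, and persist the user's chosen response type. It is a minimal illustration under assumed data (the grip_points list and the 'scroll-active-content' choice are invented), not a reconstruction of the patent's software.

# Hypothetical stream of (x, y) contact points reported while the user grips the device.
grip_points = [(5, 400), (8, 430), (6, 500), (9, 540)]

def bounding_region(points):
    xs, ys = zip(*points)
    return (min(xs), min(ys), max(xs), max(ys))

device_memory = {}
device_memory["virtual_bezel_region"] = bounding_region(grip_points)   # register detected region

def on_bezel_touch(x, y, user_choice="scroll-active-content"):
    x0, y0, x1, y1 = device_memory["virtual_bezel_region"]
    if x0 <= x <= x1 and y0 <= y <= y1:
        # Interpret as intentional input affecting content on the touchscreen display,
        # then store the user's chosen response type as personalized behavior.
        device_memory["bezel_behavior"] = user_choice
        return f"apply '{user_choice}' to active-region content"
    return "not in virtual bezel"

print(on_bezel_touch(6, 410))
print(device_memory)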

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (red color) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081309A1
CLAIM 5
. The method of claim 1 , wherein the transition indicator is unable to receive or provide dynamic user input (user input) or output , respectively , wherein the transition indicator has an appearance different from the displayed image , and wherein the transition indicator covers only part of the second touch-sensitive display before the displayed image is moved to cover fully the second touch-sensitive display .

US20120081309A1
CLAIM 6
. The method of claim 1 , wherein , when the gesture is received , the transition indicator is absent from a displayed image stack associated with the first and second touch sensitive displays , wherein the displayed image and transition indicator are simultaneously in active display positions on the first and second touch sensitive displays , respectively , prior to initiation of movement of the displayed image and wherein the transition indicator comprises a user configured color (usage frequency) , pattern , design , and/or photograph .
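
The claim 17 steps (register a polygonal touch area, count how often it is accessed, and derive a personalized holding pattern) can be paraphrased as a frequency filter. The sketch below assumes a hypothetical touch_log and an arbitrary threshold of 3 accesses, neither of which comes from the patent.

from collections import Counter

# Hypothetical log of unintentional edge touches; each entry is the vertex
# tuple of the polygonal area the touch fell into.
touch_log = [((0, 300), (60, 300), (60, 700), (0, 700))] * 5 + \
            [((1020, 200), (1080, 200), (1080, 600), (1020, 600))] * 2

registered = Counter(touch_log)    # polygonal area -> detected access frequency
threshold = 3                      # assumed cut-off, not taken from the patent

holding_pattern = [poly for poly, freq in registered.items() if freq >= threshold]
device_memory = {"virtual_bezel_region": holding_pattern}   # registered holding pattern
print(device_memory)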

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (red color) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081309A1
CLAIM 5
. The method of claim 1 , wherein the transition indicator is unable to receive or provide dynamic user input (user input) or output , respectively , wherein the transition indicator has an appearance different from the displayed image , and wherein the transition indicator covers only part of the second touch-sensitive display before the displayed image is moved to cover fully the second touch-sensitive display .

US20120081309A1
CLAIM 6
. The method of claim 1 , wherein , when the gesture is received , the transition indicator is absent from a displayed image stack associated with the first and second touch sensitive displays , wherein the displayed image and transition indicator are simultaneously in active display positions on the first and second touch sensitive displays , respectively , prior to initiation of movement of the displayed image and wherein the transition indicator comprises a user configured color (usage frequency) , pattern , design , and/or photograph .
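
For claim 18, the flow (thermal sensors yield a heat signature, the hot area is registered as a polygon, and touches inside it are treated as intentional input for content outside the bezel) can be illustrated with a toy thermal grid. The sensor resolution, the 31 degC threshold, and the bounding-box polygon are assumptions made only for this sketch.

# Toy 4x4 thermal map (degrees C); cells above THRESH are treated as the
# heat signature of the gripping hand. The sensor layout is assumed, not claimed.
THRESH = 31.0
thermal = [
    [33.1, 29.0, 28.5, 28.7],
    [33.4, 29.2, 28.6, 28.9],
    [32.8, 29.1, 28.4, 28.8],
    [32.6, 28.9, 28.5, 29.0],
]

hot = [(r, c) for r, row in enumerate(thermal)
       for c, v in enumerate(row) if v >= THRESH]
rows, cols = zip(*hot)
polygon = (min(rows), min(cols), max(rows), max(cols))   # crude polygonal area

memory = {"polygon": polygon, "access_count": 0}          # registered area and usage frequency

def touch_in_bezel(r, c):
    r0, c0, r1, c1 = memory["polygon"]
    if r0 <= r <= r1 and c0 <= c <= c1:
        memory["access_count"] += 1
        return "intentional input for content outside the virtual bezel"
    return "active-region touch"

print(touch_in_bezel(0, 0), memory)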




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081310A1

Filed: 2011-09-01     Issued: 2012-04-05

Pinch gesture to swap windows

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Rodney W. Schrock
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081310A1
CLAIM 2
. The method of claim 1 , wherein the gesture is a pinch gesture and wherein the first touch sensitive display ceases to display the first displayed image at substantially the same time (holding pattern) as the second touch display ceases to display the second displayed image .
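
The cited claim 2 of US20120081310A1 describes both displays ceasing to show their images at substantially the same time when a pinch gesture swaps them; a minimal, assumption-laden Python paraphrase:

# On a pinch gesture, both displays drop their current image in the same
# update and then show the swapped images. Names are illustrative only.
def apply_gesture(gesture, displays):
    if gesture == "pinch":
        displays["first"], displays["second"] = displays["second"], displays["first"]
    return displays

print(apply_gesture("pinch", {"first": "imageA", "second": "imageB"}))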

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081310A1
CLAIM 2
. The method of claim 1 , wherein the gesture is a pinch gesture and wherein the first touch sensitive display ceases to display the first displayed image at substantially the same time (holding pattern) as the second touch display ceases to display the second displayed image .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084714A1

Filed: 2011-08-31     Issued: 2012-04-05

Window stack models for multi-screen displays

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl, Ron Cassar, John Steven Visosky, Robert Csiki
US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084714A1
CLAIM 6
. The computer readable medium as defined in claim 1 , wherein the logical data structure comprises one (operating system status bar) or more of : a window identifier adapted to identify the active window in relation to other windows in the window stack ;
a window stack position identifier adapted to identify the position in the window stack for the active window ;
and a display identifier adapted to identify which one of two displays of the multi-screen device the active window is associated .
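
The logical data structure recited in the cited claim 6 of US20120084714A1 (window identifier, stack position identifier, display identifier) maps naturally onto a small record type. The field names below paraphrase the claim language and are not the reference's actual API.

from dataclasses import dataclass

@dataclass
class WindowStackEntry:
    window_id: str          # identifies the active window relative to other windows in the stack
    stack_position: int     # position of the active window in the window stack
    display_id: int         # which of the two displays the active window is associated with

stack = [WindowStackEntry("mail", 0, 1), WindowStackEntry("browser", 1, 2)]
active = min(stack, key=lambda e: e.stack_position)
print(active)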




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081397A1

Filed: 2011-08-31     Issued: 2012-04-05

Rotation gravity drop

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Alexander de Paz
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081397A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .
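
The cited claims 10 and 16 of US20120081397A1 describe detecting a rotation between open states and controlling the second screen from the screens' relative positions. A hedged sketch, with invented state names and a placeholder output string:

# Decide what the second screen shows after a rotation between open states,
# based on the relative positions of the two screens.
def on_rotation(old_state, new_state, first_screen_side):
    if old_state == new_state:
        return "no change"
    # The second screen sits opposite the screen that was showing the application.
    second_screen_side = "right" if first_screen_side == "left" else "left"
    return f"control application data shown on the {second_screen_side} screen"

print(on_rotation("portrait-open", "landscape-open", "left"))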

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .
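
Claims 3 and 4 charted here attribute a touch that crosses regions to the region where it originated. A minimal sketch of that routing rule, reusing a hypothetical active-region rectangle:

# A drag is processed as input of the region in which it originated,
# regardless of where it terminates.
def region_of(point, active=(100, 100, 980, 1820)):
    x, y = point
    x0, y0, x1, y1 = active
    return "active" if x0 <= x < x1 and y0 <= y < y1 else "bezel"

def route_drag(start, end):
    return f"process as {region_of(start)}-region input (ended in {region_of(end)})"

print(route_drag((540, 960), (20, 960)))   # active -> bezel, handled as active-region input
print(route_drag((20, 960), (540, 960)))   # bezel -> active, handled as bezel-region input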

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .
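
Claim 8 places an operating system status bar in the virtual bezel and lets a predefined gesture set toggle its visibility together with full-screen mode. A short sketch, with an invented gesture name standing in for the predefined set:

state = {"status_bar_visible": True, "full_screen": False}

def on_bezel_gesture(gesture):
    if gesture == "two-finger-swipe-down":      # assumed member of the predefined gesture set
        state["status_bar_visible"] = not state["status_bar_visible"]
        state["full_screen"] = not state["full_screen"]
    return state

print(on_bezel_gesture("two-finger-swipe-down"))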

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .
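
Claims 9 through 11 charted in this group let the user reposition, hide or show, and add touch-based soft buttons within the virtual bezel. A compact sketch with illustrative button names and coordinates:

# Pre-defined soft buttons residing in the virtual bezel region.
buttons = {"back": {"pos": (10, 1800), "visible": True}}

def reposition(name, new_pos):
    buttons[name]["pos"] = new_pos              # claim 9: move a button within the bezel

def toggle(name):
    buttons[name]["visible"] = not buttons[name]["visible"]   # claim 10: visible/hidden mode

def add_button(name, pos):
    buttons[name] = {"pos": pos, "visible": True}             # claim 11: add a button

reposition("back", (10, 1700)); toggle("back"); add_button("home", (60, 1800))
print(buttons)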

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set (second set, display system) of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081397A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a sensor configured to produce an electrical signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
a second set of instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and a third set (third set) of instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081397A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081397A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081397A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081397A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that the multi-screen device has been rotated from a first open state to a second different open state ;
instructions configured to determine a relative position of a first screen that originally displayed data from a first application when the multi-screen device was in the first open state and a second screen that was not displaying data from the first application when the multi-screen device was in the first open state ;
and instructions configured to control data displayed on the second screen in the second open state based on the determined relative position of the first screen and second screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081267A1

Filed: 2011-08-31     Issued: 2012-04-05

Desktop reveal expansion

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (closed state) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081267A1
CLAIM 2
. The method of claim 1 , wherein the first predetermined input comprises at least one of the following : i) a user input (user input) gesture ;
ii) a combination of user input gestures ;
iii) a memory output ;
iv) a response to a programmed condition ;
and v) hardware timers .

US20120081267A1
CLAIM 7
. The method of claim 6 , wherein the third desktop is virtually displayed on the second screen while the multi-screen device is in a closed state (first portion) .

US20120081267A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on a first screen of the multi-screen device ;
instructions configured to determine a first desktop to display on the first screen ;
instructions configured to determine a second desktop to display on a second screen of the multi-screen device ;
and instructions configured to respond to the first predetermined input with an output that causes the first desktop to be displayed on the first screen and which also causes the second desktop to be at least one of (i) actually displayed on the second screen and (ii) virtually displayed on the second screen .

US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .
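
The cited claims of US20120081267A1 pair a predetermined input that reveals a desktop with a hinge-position signal that decides whether the second desktop is actually or only virtually displayed. A hedged Python paraphrase, with the input categories echoing claim 2 and everything else assumed:

def reveal_desktops(predetermined_input, hinge_open):
    # Accept only the kinds of predetermined input recited in the cited claim 2.
    if predetermined_input not in ("user-gesture", "timer", "programmed-condition"):
        return None
    return {
        "screen1": "desktop-1 (actually displayed)",
        "screen2": "desktop-2 (actually displayed)" if hinge_open
                   else "desktop-2 (virtually displayed)",
    }

print(reveal_desktops("user-gesture", hinge_open=False))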

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081267A1
CLAIM 16
. A multi-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , whether the hinge is in a first position or second position ;
and a second set (second set, display system) of instructions configured to determine , based on the first signal , whether to actually display or virtually display a second desktop on the second screen .
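
Claims 9 through 12 of US9645663B2, as charted above, amount to a small piece of user-configurable state kept for the virtual bezel: repositionable soft buttons, per-button visible/hidden toggling, user-added buttons, and a hideable status display panel. The sketch below uses invented names (BezelSoftButton, VirtualBezelPanel) purely to make those operations concrete; nothing here is taken from the patent's own implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BezelSoftButton:
    label: str
    x: int               # position within the virtual bezel region
    visible: bool = True

@dataclass
class VirtualBezelPanel:
    buttons: List[BezelSoftButton] = field(default_factory=list)
    status_panel_visible: bool = True   # electronic device status display panel

    def reposition(self, label: str, new_x: int) -> None:
        # Claim 9: the user repositions a soft button within the bezel region.
        for b in self.buttons:
            if b.label == label:
                b.x = new_x

    def toggle_button(self, label: str) -> None:
        # Claim 10: toggle a soft button between a visible mode and a hidden mode.
        for b in self.buttons:
            if b.label == label:
                b.visible = not b.visible

    def add_button(self, label: str, x: int) -> None:
        # Claim 11: the user adds a soft button within the bezel region.
        self.buttons.append(BezelSoftButton(label, x))

    def toggle_status_panel(self) -> None:
        # Claim 12: toggle the status display panel between visible and hidden.
        self.status_panel_visible = not self.status_panel_visible

if __name__ == "__main__":
    bezel = VirtualBezelPanel([BezelSoftButton("back", x=10)])
    bezel.add_button("home", x=60)
    bezel.reposition("back", new_x=30)
    bezel.toggle_button("home")
    bezel.toggle_status_panel()
    print(bezel)
```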

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (closed state) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081267A1
CLAIM 2
. The method of claim 1 , wherein the first predetermined input comprises at least one of the following : i) a user input (user input) gesture ;
ii) a combination of user input gestures ;
iii) a memory output ;
iv) a response to a programmed condition ;
and v) hardware timers .

US20120081267A1
CLAIM 7
. The method of claim 6 , wherein the third desktop is virtually displayed on the second screen while the multi-screen device is in a closed state (first portion) .

US20120081267A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on a first screen of the multi-screen device ;
instructions configured to determine a first desktop to display on the first screen ;
instructions configured to determine a second desktop to display on a second screen of the multi-screen device ;
and instructions configured to respond to the first predetermined input with an output that causes the first desktop to be displayed on the first screen and which also causes the second desktop to be at least one of (i) actually displayed on the second screen and (ii) virtually displayed on the second screen .
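
Claim 14 of US9645663B2 turns on a single routing decision: a touch landing in the active region is handled in the first mode, while a touch landing in the virtual bezel is handled in the second mode and only passed through if it is selectively judged intentional. The Python sketch below illustrates one such decision with assumed geometry and heuristics (a fixed bezel width, an is_intentional test based on movement and dwell time); the patent does not prescribe these particular values.

```python
BEZEL_WIDTH = 40                  # assumed bezel width in pixels along each edge
SCREEN_W, SCREEN_H = 1080, 1920   # assumed display resolution

def in_virtual_bezel(x: int, y: int) -> bool:
    # The virtual bezel runs along one or more edges, adjacent to the active region.
    return (x < BEZEL_WIDTH or x > SCREEN_W - BEZEL_WIDTH
            or y < BEZEL_WIDTH or y > SCREEN_H - BEZEL_WIDTH)

def is_intentional(duration_ms: int, moved_px: int) -> bool:
    # Second mode of response: selectively treat bezel touches as intentional.
    # Clear swipes and short deliberate taps count; long static grips do not.
    return moved_px > 20 or duration_ms < 250

def route_touch(x: int, y: int, duration_ms: int, moved_px: int) -> str:
    if not in_virtual_bezel(x, y):
        return "first mode: handle within the active touchscreen region"
    if is_intentional(duration_ms, moved_px):
        return "second mode: forward as intentional input affecting the active region"
    return "second mode: ignore as incidental grip contact"

if __name__ == "__main__":
    print(route_touch(500, 900, duration_ms=120, moved_px=5))   # active-region tap
    print(route_touch(10, 900, duration_ms=120, moved_px=60))   # bezel swipe, intentional
    print(route_touch(10, 900, duration_ms=3000, moved_px=2))   # static grip, ignored
```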

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081267A1
CLAIM 2
. The method of claim 1 , wherein the first predetermined input comprises at least one of the following : i) a user input (user input) gesture ;
ii) a combination of user input gestures ;
iii) a memory output ;
iv) a response to a programmed condition ;
and v) hardware timers .

US20120081267A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on a first screen of the multi-screen device ;
instructions configured to determine a first desktop to display on the first screen ;
instructions configured to determine a second desktop to display on a second screen of the multi-screen device ;
and instructions configured to respond to the first predetermined input with an output that causes the first desktop to be displayed on the first screen and which also causes the second desktop to be at least one of (i) actually displayed on the second screen and (ii) virtually displayed on the second screen .
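
Claim 16 of US9645663B2 recites a short calibration flow: detect where the holding hand contacts the screen, register that region as the virtual bezel, let the user instruct what type of response should run there, and store that choice as personalized behavior. The sketch below is one illustrative reading; the helper names (detect_grip_region, register_bezel) and the bounding-box simplification are assumptions, not the patent's code.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Rect = Tuple[int, int, int, int]   # x, y, width, height

@dataclass
class DeviceMemory:
    bezel_regions: List[Rect] = field(default_factory=list)
    personalized_behavior: Dict[Rect, str] = field(default_factory=dict)

def detect_grip_region(contact_points: List[Tuple[int, int]]) -> Rect:
    # Detect the region of the touchscreen in contact with the holding fingers,
    # here reduced to a bounding box (one possible reading of "detected region").
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))

def register_bezel(memory: DeviceMemory, region: Rect, user_choice: str) -> None:
    # Register the detected region as the virtual bezel and store the user's
    # chosen response type as personalized behavior for that region.
    memory.bezel_regions.append(region)
    memory.personalized_behavior[region] = user_choice

if __name__ == "__main__":
    mem = DeviceMemory()
    grip = detect_grip_region([(2, 300), (8, 420), (5, 560)])
    register_bezel(mem, grip, user_choice="scroll content in the active region")
    print(mem)
```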

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081267A1
CLAIM 2
. The method of claim 1 , wherein the first predetermined input comprises at least one of the following : i) a user input (user input) gesture ;
ii) a combination of user input gestures ;
iii) a memory output ;
iv) a response to a programmed condition ;
and v) hardware timers .

US20120081267A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on a first screen of the multi-screen device ;
instructions configured to determine a first desktop to display on the first screen ;
instructions configured to determine a second desktop to display on a second screen of the multi-screen device ;
and instructions configured to respond to the first predetermined input with an output that causes the first desktop to be displayed on the first screen and which also causes the second desktop to be at least one of (i) actually displayed on the second screen and (ii) virtually displayed on the second screen .
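
Claim 17 derives the bezel from observed behavior rather than a fixed margin: unintentional grip touches define the vertices of a polygon, and the frequency with which that polygon is accessed is used to confirm it as the user's personalized holding pattern. The sketch below counts touches against a stored polygon and promotes it to the bezel once an assumed frequency threshold is crossed; the ray-casting test, the threshold, and the class name are illustrative only.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(pt: Point, poly: List[Point]) -> bool:
    # Standard ray-casting containment test against the polygon whose vertices
    # were taken from the unintentional grip touches.
    x, y = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

class HoldingPatternModel:
    # Assumed container: the registered polygon plus its access frequency
    # define the personalized holding pattern.
    def __init__(self, polygon: List[Point], promote_after: int = 50):
        self.polygon = polygon
        self.hits = 0
        self.promote_after = promote_after
        self.is_virtual_bezel = False

    def observe_touch(self, pt: Point) -> None:
        if point_in_polygon(pt, self.polygon):
            self.hits += 1
            if self.hits >= self.promote_after:
                # Register the holding pattern as the device's virtual bezel region.
                self.is_virtual_bezel = True

if __name__ == "__main__":
    grip_polygon = [(0, 300), (40, 320), (45, 600), (0, 620)]
    model = HoldingPatternModel(grip_polygon, promote_after=3)
    for p in [(10, 400), (20, 500), (15, 450)]:
        model.observe_touch(p)
    print(model.is_virtual_bezel, model.hits)
```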

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081267A1
CLAIM 2
. The method of claim 1 , wherein the first predetermined input comprises at least one of the following : i) a user input (user input) gesture ;
ii) a combination of user input gestures ;
iii) a memory output ;
iv) a response to a programmed condition ;
and v) hardware timers .

US20120081267A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on a first screen of the multi-screen device ;
instructions configured to determine a first desktop to display on the first screen ;
instructions configured to determine a second desktop to display on a second screen of the multi-screen device ;
and instructions configured to respond to the first predetermined input with an output that causes the first desktop to be displayed on the first screen and which also causes the second desktop to be at least one of (i) actually displayed on the second screen and (ii) virtually displayed on the second screen .
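
Claim 18 replaces the touch-derived polygon of claim 17 with one derived from the device's thermal sensors. One way to picture the first step is thresholding a coarse heat map into the cells warmed by the holding hand and taking their outline as the polygonal area; the grid size, threshold, and helper names below are assumptions for illustration, not the patent's method.

```python
from typing import List, Tuple

Cell = Tuple[int, int]

def heat_signature_cells(heat_map: List[List[float]], threshold: float = 30.0) -> List[Cell]:
    # Cells warmed above an assumed skin-contact threshold approximate the
    # heat signature of the hand holding the device.
    return [(r, c)
            for r, row in enumerate(heat_map)
            for c, temp in enumerate(row)
            if temp >= threshold]

def bounding_polygon(cells: List[Cell]) -> List[Cell]:
    # Reduce the warmed cells to the vertices of a simple rectangular polygon;
    # the claim only requires "vertices of a polygonal area", not a specific shape.
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    return [(min(rows), min(cols)), (min(rows), max(cols)),
            (max(rows), max(cols)), (max(rows), min(cols))]

if __name__ == "__main__":
    # 4x6 thermal grid; warm cells along the left edge where the hand grips.
    grid = [[33.0, 22.0, 21.5, 21.0, 21.0, 21.0],
            [34.5, 31.0, 21.5, 21.0, 21.0, 21.0],
            [34.0, 30.5, 21.5, 21.0, 21.0, 21.0],
            [32.0, 22.0, 21.5, 21.0, 21.0, 21.0]]
    print(bounding_polygon(heat_signature_cells(grid)))
```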




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081268A1

Filed: 2011-08-31     Issued: 2012-04-05

Launching applications into revealed desktop

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Martin Gimpl
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081268A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a first application on a first screen of a multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on the first screen and the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that causes the desktop to be displayed on the first screen and the second screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to display a second application on the first screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that causes the second application to be displayed on the first screen and which also causes the first application to be displayed on the second screen of the multi-screen device .

US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input (user input) gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .
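
Claim 20 of US20120081268A1 chains three lookups: which gesture area produced the received signal, whether the touched position corresponds to a known application on either display, and, if so, launching that application on the originating screen. The Python sketch below is an assumed rendering of that chain, with dictionaries standing in for the two display areas; it is illustrative only, not the reference's implementation.

```python
from typing import Dict, Optional, Tuple

Position = Tuple[int, int]

# Assumed stand-ins for the first and second display areas: position -> application.
KNOWN_APPS: Dict[int, Dict[Position, str]] = {
    1: {(100, 200): "mail"},
    2: {(300, 400): "browser"},
}

def input_origin(signal: dict) -> int:
    # First set of instructions: determine which user input gesture area
    # (first or second screen) produced the received signal.
    return signal["screen"]

def matching_application(screen: int, pos: Position) -> Optional[str]:
    # Second set of instructions: does the position correspond to a known application?
    return KNOWN_APPS.get(screen, {}).get(pos)

def handle_signal(signal: dict) -> str:
    screen = input_origin(signal)
    app = matching_application(screen, signal["position"])
    if app is None:
        return "no known application at this position"
    # Third set of instructions: launch and display on the originating screen.
    return f"launching '{app}' on screen {screen}"

if __name__ == "__main__":
    print(handle_signal({"screen": 1, "position": (100, 200)}))
    print(handle_signal({"screen": 2, "position": (300, 400)}))
    print(handle_signal({"screen": 2, "position": (0, 0)}))
```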

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .
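
Claims 3 and 4 of US9645663B2 resolve cross-boundary drags by their point of origin: a gesture that starts in the active region is processed as active-region input even if it ends in the bezel, and the converse for a gesture that starts in the bezel. The few lines below sketch that origin-based attribution, reusing the same assumed in_virtual_bezel predicate shown earlier for claim 14.

```python
BEZEL_WIDTH = 40
SCREEN_W, SCREEN_H = 1080, 1920

def in_virtual_bezel(x: int, y: int) -> bool:
    return (x < BEZEL_WIDTH or x > SCREEN_W - BEZEL_WIDTH
            or y < BEZEL_WIDTH or y > SCREEN_H - BEZEL_WIDTH)

def attribute_drag(start: tuple, end: tuple) -> str:
    # Claims 3 and 4: the region in which the drag ORIGINATES decides how the
    # whole gesture is processed, regardless of where it terminates.
    if in_virtual_bezel(*start):
        return "process as touch-based input within the virtual bezel region"
    return "process as touch-based input within the active touchscreen region"

if __name__ == "__main__":
    print(attribute_drag(start=(500, 900), end=(10, 900)))   # active -> bezel (claim 3)
    print(attribute_drag(start=(10, 900), end=(500, 900)))   # bezel -> active (claim 4)
```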

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .
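
Claims 5, 6 and 7 address the same edge case in three ways: a multi-touch input that originates simultaneously in both regions is attributed to the virtual bezel (claim 5), to the active region (claim 6), or according to an instruction the user has given in advance (claim 7). The sketch below expresses the three outcomes as a single policy setting; the policy names are assumptions made for illustration.

```python
from typing import List

def resolve_spanning_multitouch(touch_regions: List[str], policy: str) -> str:
    # touch_regions: region of each simultaneous contact, e.g. ["active", "bezel"].
    spans_both = "active" in touch_regions and "bezel" in touch_regions
    if not spans_both:
        return f"single-region multi-touch: handle in the {touch_regions[0]} region"
    if policy == "prefer_bezel":     # claim 5
        return "process as multi-touch within the virtual bezel region"
    if policy == "prefer_active":    # claim 6
        return "process as multi-touch within the active touchscreen region"
    # Claim 7: follow the instruction the user registered for the gestural hardware.
    return f"process per user instruction: {policy}"

if __name__ == "__main__":
    contacts = ["active", "bezel"]
    print(resolve_spanning_multitouch(contacts, "prefer_bezel"))
    print(resolve_spanning_multitouch(contacts, "prefer_active"))
    print(resolve_spanning_multitouch(contacts, "treat as zoom on the active region"))
```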

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .
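
Claim 8 places the operating system status bar inside the virtual bezel and lets a predefined gesture flip it between visible and hidden, i.e. a full-screen toggle. The few lines below show that toggle keyed to an assumed gesture name; the gesture itself and the class name are hypothetical.

```python
class BezelStatusBar:
    # Operating system status bar residing in the virtual bezel region.
    def __init__(self) -> None:
        self.visible = True

    def on_gesture(self, gesture: str) -> bool:
        # Assumed member of the "predefined set of gestures" that toggles
        # full-screen mode by hiding or revealing the status bar.
        if gesture == "two_finger_swipe_in_bezel":
            self.visible = not self.visible
        return self.visible

if __name__ == "__main__":
    bar = BezelStatusBar()
    print(bar.on_gesture("two_finger_swipe_in_bezel"))   # False: hidden, full-screen on
    print(bar.on_gesture("two_finger_swipe_in_bezel"))   # True: status bar shown again
```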

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set (second set, display system) of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set (third set) of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081268A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a first application on a first screen of a multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on the first screen and the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that causes the desktop to be displayed on the first screen and the second screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to display a second application on the first screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that causes the second application to be displayed on the first screen and which also causes the first application to be displayed on the second screen of the multi-screen device .

US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input (user input) gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081268A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a first application on a first screen of a multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on the first screen and the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that causes the desktop to be displayed on the first screen and the second screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to display a second application on the first screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that causes the second application to be displayed on the first screen and which also causes the first application to be displayed on the second screen of the multi-screen device .

US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input (user input) gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081268A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a first application on a first screen of a multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on the first screen and the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that causes the desktop to be displayed on the first screen and the second screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to display a second application on the first screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that causes the second application to be displayed on the first screen and which also causes the first application to be displayed on the second screen of the multi-screen device .

US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input (user input) gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081268A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to display a first application on a first screen of a multi-screen device ;
instructions configured to receive a first predetermined input that represents an instruction to reveal a desktop on the first screen and the second screen of the multi-screen device ;
instructions configured to respond to the first predetermined input with an output that causes the desktop to be displayed on the first screen and the second screen of the multi-screen device ;
instructions configured to receive a second predetermined input that represents an instruction to display a second application on the first screen of the multi-screen device ;
instructions configured to respond to the second predetermined input with an output that causes the second application to be displayed on the first screen and which also causes the first application to be displayed on the second screen of the multi-screen device .

US20120081268A1
CLAIM 20
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a first user input (user input) gesture area configured to receive a first signal indicative of a position on the first screen ;
a second user input gesture area configured to receive a second signal indicative of a position on the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine a received signal based on the input origin of a first or second signal ;
and a second set of instructions configured to determine whether the received signal corresponds to the position of a known application displayed on the first or second display ;
and a third set of instructions configured to launch the known application and display it to the screen from where the received signal originated .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081269A1

Filed: 2011-08-31     Issued: 2012-04-05

Gravity drop

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Alex de Paz
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081269A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .

US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set (first set) of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
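
Claims 10 and 16 of US20120081269A1 key off a state transition: when the hinge opens from a closed landscape state, the relative position of the two screens determines how the data that had been on the first display is laid out across both displays. The Python sketch below models that as a small state machine with assumed position labels and layout outcomes; it is an illustration, not the reference's code.

```python
from enum import Enum

class DeviceState(Enum):
    CLOSED_LANDSCAPE = 1
    OPEN_LANDSCAPE = 2

def on_hinge_opened(prev_state: DeviceState,
                    relative_position: str,
                    first_display_app: str) -> dict:
    # Runs when the first signal indicates the hinge moved to the open position.
    if prev_state is not DeviceState.CLOSED_LANDSCAPE:
        return {"first": first_display_app, "second": None}
    # Third set of instructions: automatically control what each display shows,
    # based on the second signal (relative position of the two screens).
    if relative_position == "second_screen_below":
        return {"first": first_display_app,
                "second": f"{first_display_app} (content dropped to lower screen)"}
    return {"first": first_display_app, "second": "secondary desktop"}

if __name__ == "__main__":
    print(on_hinge_opened(DeviceState.CLOSED_LANDSCAPE, "second_screen_below", "video player"))
    print(on_hinge_opened(DeviceState.CLOSED_LANDSCAPE, "second_screen_above", "video player"))
```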

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
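
US9645663B2 claim 8 recites an operating system status bar residing in the virtual bezel region whose visibility the user toggles with a predefined set of gestures to enter a full-screen mode. A minimal sketch of that behavior follows; the StatusBar class and the gesture names are hypothetical assumptions, not taken from the patent.

class StatusBar:
    """Hypothetical status bar living inside the virtual bezel region."""

    # Hypothetical predefined gesture set that toggles full-screen mode (claim 8).
    TOGGLE_GESTURES = {"two_finger_swipe_down", "double_tap_bezel"}

    def __init__(self):
        self.visible = True           # status bar shown inside the virtual bezel
        self.full_screen_mode = False

    def on_bezel_gesture(self, gesture: str) -> None:
        # Only gestures from the predefined set toggle visibility / full-screen mode;
        # other bezel input is left to the normal second-mode handling.
        if gesture in self.TOGGLE_GESTURES:
            self.visible = not self.visible
            self.full_screen_mode = not self.visible

bar = StatusBar()
bar.on_bezel_gesture("two_finger_swipe_down")
assert bar.full_screen_mode and not bar.visible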

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .
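
Claims 9 through 11 of US9645663B2 add a pre-defined set of touch-based soft buttons to the virtual bezel region that the user can reposition, toggle between visible and hidden modes, and extend with new buttons. A compact sketch of such a button registry is given below; SoftButton, BezelButtonRegistry, and the coordinate model are hypothetical assumptions used only for illustration.

from dataclasses import dataclass

@dataclass
class SoftButton:
    name: str
    x: int             # position inside the virtual bezel region (assumed pixel coordinates)
    y: int
    visible: bool = True

class BezelButtonRegistry:
    """Hypothetical registry of soft buttons residing in the virtual bezel region."""

    def __init__(self, predefined):
        self.buttons = {b.name: b for b in predefined}    # pre-defined set (claim 9)

    def reposition(self, name: str, x: int, y: int) -> None:
        # Claim 9: the user can reposition a soft button within the virtual bezel region.
        self.buttons[name].x, self.buttons[name].y = x, y

    def toggle(self, name: str) -> None:
        # Claim 10: toggle a soft button between a visible mode and a hidden mode.
        self.buttons[name].visible = not self.buttons[name].visible

    def add(self, button: SoftButton) -> None:
        # Claim 11: the user can add one or more soft buttons within the bezel region.
        self.buttons[button.name] = button

registry = BezelButtonRegistry([SoftButton("back", 0, 10), SoftButton("home", 0, 40)])
registry.reposition("back", 0, 80)
registry.toggle("home")
registry.add(SoftButton("screenshot", 0, 120))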

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set (second set, display system) of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081269A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
a hinge configured to connect the first screen with the second screen ;
a first sensor configured to produce a first signal indicative of a position of the hinge ;
a second sensor configured to produce a second signal indicative of a relative position of the first screen and the second screen ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to determine , based at least partially on the first signal , that the hinge has moved from a first position to a second position in which the first and second screens are open ;
a second set of instructions configured to determine , based on the second signal , the relative position of the first and second screens , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and a third set (third set) of instructions configured to automatically control data displayed on the first and second displays after the hinge has opened .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081269A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .
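
The challenged independent claims (1 and 14) turn on a single touch-sensitive surface split into an active touchscreen region with a first mode of response and a virtual bezel region with a second mode of response that selectively treats some bezel touches as intentional input affecting the active-region content. A minimal dispatch sketch follows; the bezel geometry, the gesture names, and the TouchDispatcher class are hypothetical assumptions that merely illustrate the claimed two-mode split.

BEZEL_WIDTH = 40  # hypothetical bezel band width in pixels along each screen edge

class TouchDispatcher:
    """Hypothetical dispatcher applying two modes of response to one touchscreen layer."""

    def __init__(self, screen_w: int, screen_h: int):
        self.screen_w, self.screen_h = screen_w, screen_h

    def in_virtual_bezel(self, x: int, y: int) -> bool:
        # Virtual bezel region: a band along one or more edges of the display screen.
        return (x < BEZEL_WIDTH or x > self.screen_w - BEZEL_WIDTH or
                y < BEZEL_WIDTH or y > self.screen_h - BEZEL_WIDTH)

    def handle_touch(self, x: int, y: int, gesture: str) -> str:
        if not self.in_virtual_bezel(x, y):
            # First mode of response: ordinary touch handling in the active region.
            return f"active-region input: {gesture}"
        # Second mode of response: selectively interpret bezel touches, passing through
        # only those treated as intentional input affecting the active-region content.
        intentional = gesture in {"swipe", "double_tap"}   # hypothetical gesture set
        if intentional:
            return f"bezel input ({gesture}) treated as intentional, affects active-region content"
        return "bezel input treated as incidental grip contact and suppressed"

d = TouchDispatcher(1080, 1920)
print(d.handle_touch(500, 900, "tap"))    # active region, first mode
print(d.handle_touch(10, 900, "swipe"))   # bezel region, interpreted as intentional
print(d.handle_touch(10, 900, "rest"))    # bezel region, treated as grip contact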

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081269A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .
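
Method claim 16 of US9645663B2 walks through detecting the grip-contact region, registering it as the virtual bezel region, and storing the user's chosen response behavior for that region. A sketch under assumed names (GripCalibrator and its in-memory dictionary) is shown below; it is illustrative only and not the patent's described implementation.

class GripCalibrator:
    """Hypothetical calibration flow mirroring the steps of method claim 16."""

    def __init__(self):
        self.memory = {"bezel_region": None, "personalized_behavior": None}

    def detect_grip_region(self, touch_points):
        # Detect the region of the touchscreen in contact with the holding fingers
        # (here simply the bounding box of the reported contact points).
        xs = [p[0] for p in touch_points]
        ys = [p[1] for p in touch_points]
        return (min(xs), min(ys), max(xs), max(ys))

    def register_bezel(self, region) -> None:
        # Register the detected region as the virtual bezel region in device memory.
        self.memory["bezel_region"] = region

    def register_response(self, instruction: str) -> None:
        # Offer the user a choice of response type and store it as personalized
        # behavior for the detected region.
        self.memory["personalized_behavior"] = instruction

cal = GripCalibrator()
cal.register_bezel(cal.detect_grip_region([(2, 300), (5, 360), (3, 420)]))
cal.register_response("treat single taps as intentional, ignore resting contact")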

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081269A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .
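
Claim 17 defines the virtual bezel from unintentional grip touches that form a polygonal area whose access frequency is tracked to build a personalized holding pattern. The sketch below, with hypothetical names (HoldingPatternTracker, MIN_HITS), illustrates one way such frequency accounting could look; it is not drawn from the patent's specification.

from collections import Counter

MIN_HITS = 20  # hypothetical frequency threshold for treating an area as part of the grip

class HoldingPatternTracker:
    """Hypothetical tracker of how often each registered polygonal area is touched."""

    def __init__(self):
        self.polygons = {}        # area id -> list of (x, y) vertices held in memory
        self.hits = Counter()     # area id -> detected access frequency

    def register_polygon(self, area_id: str, vertices) -> None:
        self.polygons[area_id] = list(vertices)

    def record_touch(self, area_id: str) -> None:
        # Detect the frequency of accessing the registered polygonal area.
        self.hits[area_id] += 1

    def personalized_holding_pattern(self):
        # Use the registered polygons and their usage frequency to define the
        # personalized holding pattern, i.e. the user-specific virtual bezel region.
        return [self.polygons[a] for a, n in self.hits.items() if n >= MIN_HITS]

tracker = HoldingPatternTracker()
tracker.register_polygon("left_thumb", [(0, 300), (40, 300), (40, 460), (0, 460)])
for _ in range(25):
    tracker.record_touch("left_thumb")
bezel_region = tracker.personalized_holding_pattern()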

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081269A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to determine that a dual-screen device has transitioned from a closed landscape state to an open landscape state ;
instructions configured to determine a relative position of a first screen of the dual-screen device and a second screen of the dual-screen device , wherein the first screen comprises a first display that originally displayed data for a first application when the dual-screen device was in the closed landscape state , and wherein the second screen comprises a second display that was not displaying data from the first application when the dual-screen device was in the closed landscape state ;
and instructions configured to control data displayed on the first and second displays in the open landscape state based on the determined relative position of the first screen and second screen .
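
Claim 18 substitutes a thermal reading for the touch-based detection of claim 17: the device's thermal sensors return a heat signature whose footprint is registered as a polygonal area. A minimal sketch of turning a temperature grid into such a polygon follows; the grid format and HEAT_THRESHOLD_C value are assumptions and are not specified by the patent.

HEAT_THRESHOLD_C = 31.0  # hypothetical skin-contact temperature threshold in Celsius

def heat_signature_polygon(thermal_grid):
    """Return bounding-box vertices of cells warmed above the threshold.

    thermal_grid is assumed to be a 2-D list of Celsius readings aligned with the
    touchscreen; the actual sensor format is not taken from the patent.
    """
    warm = [(x, y)
            for y, row in enumerate(thermal_grid)
            for x, temp in enumerate(row)
            if temp >= HEAT_THRESHOLD_C]
    if not warm:
        return []
    xs, ys = [p[0] for p in warm], [p[1] for p in warm]
    # Vertices of the polygonal area formed by the hand's heat signature.
    return [(min(xs), min(ys)), (max(xs), min(ys)), (max(xs), max(ys)), (min(xs), max(ys))]

grid = [[29.0, 29.5, 30.0], [32.0, 33.5, 29.0], [32.5, 31.5, 29.0]]
print(heat_signature_polygon(grid))   # polygon registered in memory as the grip area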




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081302A1

Filed: 2011-08-31     Issued: 2012-04-05

Multi-screen display control

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Martin Gimpl, Alexander de Paz, Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system (second set) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (second set) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set (first set) of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US20120081302A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to analyze a first user input received at a dual-screen device and determine that the first user input corresponds to instructions to maximize a first application displayed on a first screen of the dual-screen device ;
instructions configuration to reference maximization rules for the first application ;
and instructions configured to invoke the maximization of the first application such that data from the first application is simultaneously displayed on both the first screen and a second screen of the dual-screen device .
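
The mapped reference, US20120081302A1 claim 10, recites analyzing a user input as a maximize command, referencing per-application maximization rules, and then presenting the application's data on both screens. The sketch below uses a hypothetical MAXIMIZATION_RULES table and rule names ("duplicate", "extend") that are assumptions for illustration only.

# Hypothetical per-application maximization rules (assumed, not from the reference).
MAXIMIZATION_RULES = {"browser": "extend", "video": "duplicate"}

def handle_input(app: str, user_input: str) -> dict:
    # Analyze the first user input and determine whether it asks to maximize
    # the application shown on the first screen.
    if user_input != "maximize_gesture":
        return {"first_screen": app, "second_screen": None}
    # Reference the maximization rules for that application.
    rule = MAXIMIZATION_RULES.get(app, "extend")
    # Invoke maximization so data from the application appears on both screens.
    if rule == "duplicate":
        return {"first_screen": app, "second_screen": app}
    return {"first_screen": f"{app} (left half)", "second_screen": f"{app} (right half)"}

print(handle_input("browser", "maximize_gesture"))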

US9645663B2
CLAIM 2
. The display system (second set) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 3
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 4
. The display system (second set) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .
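
Dependent claims 3 and 4 of US9645663B2 resolve drags that cross the boundary between regions by attributing the whole input to the region where it originated. A small sketch of that attribution rule follows; the 40-pixel left-edge bezel geometry is a hypothetical assumption.

def classify_drag(start, end, in_virtual_bezel):
    """Attribute a drag gesture to a region based on where it originated.

    in_virtual_bezel is any callable mapping an (x, y) point to True when the point
    lies in the virtual bezel region (the geometry here is assumed, not claimed).
    """
    started_in_bezel = in_virtual_bezel(*start)
    ended_in_bezel = in_virtual_bezel(*end)
    if not started_in_bezel and ended_in_bezel:
        # Claim 3: originated in the active region, terminated in the bezel ->
        # processed as active-region input.
        return "active_region"
    if started_in_bezel and not ended_in_bezel:
        # Claim 4: originated in the bezel, terminated in the active region ->
        # processed as virtual-bezel input.
        return "virtual_bezel"
    return "virtual_bezel" if started_in_bezel else "active_region"

bezel = lambda x, y: x < 40                            # hypothetical 40-pixel left-edge bezel
print(classify_drag((10, 500), (300, 500), bezel))     # -> virtual_bezel (claim 4)
print(classify_drag((300, 500), (10, 500), bezel))     # -> active_region (claim 3)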

US9645663B2
CLAIM 5
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 6
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 7
. The display system (second set) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 8
. The display system (second set) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 9
. The display system (second set) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 10
. The display system (second set) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 11
. The display system (second set) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 12
. The display system (second set) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081302A1
CLAIM 3
. The method of claim 2 , wherein the first application is maximized in a dual-landscape output configuration such that a first set of data from the first application is displayed on the display of the first screen and a second set (second set, display system) of data from the first application is displayed on the display of the second screen .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (third set) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081302A1
CLAIM 16
. A dual-screen user device , comprising : a first screen including a first display area ;
a second screen including a second display area ;
and a computer-readable medium having instructions stored thereon that include : a first set of instructions configured to analyze a first user input received at a dual-screen device and determine that the first user input corresponds to instructions to maximize a first application displayed on the display of the first screen ;
a second set of instructions configuration to reference maximization rules for the first application ;
and a third set (third set) of instructions configured to invoke the maximization of the first application such that data from the first application is simultaneously displayed on the both the first display area and the second display area .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081302A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to analyze a first user input received at a dual-screen device and determine that the first user input corresponds to instructions to maximize a first application displayed on a first screen of the dual-screen device ;
instructions configuration to reference maximization rules for the first application ;
and instructions configured to invoke the maximization of the first application such that data from the first application is simultaneously displayed on both the first screen and a second screen of the dual-screen device .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081302A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to analyze a first user input received at a dual-screen device and determine that the first user input corresponds to instructions to maximize a first application displayed on a first screen of the dual-screen device ;
instructions configuration to reference maximization rules for the first application ;
and instructions configured to invoke the maximization of the first application such that data from the first application is simultaneously displayed on both the first screen and a second screen of the dual-screen device .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081302A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to analyze a first user input received at a dual-screen device and determine that the first user input corresponds to instructions to maximize a first application displayed on a first screen of the dual-screen device ;
instructions configuration to reference maximization rules for the first application ;
and instructions configured to invoke the maximization of the first application such that data from the first application is simultaneously displayed on both the first screen and a second screen of the dual-screen device .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081302A1
CLAIM 10
. A non-transitory computer-readable medium having stored thereon instructions that cause a computing system (touchscreen layer, touchscreen display) to execute a method , the instructions comprising : instructions configured to analyze a first user input received at a dual-screen device and determine that the first user input corresponds to instructions to maximize a first application displayed on a first screen of the dual-screen device ;
instructions configuration to reference maximization rules for the first application ;
and instructions configured to invoke the maximization of the first application such that data from the first application is simultaneously displayed on both the first screen and a second screen of the dual-screen device .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081304A1

Filed: 2011-08-31     Issued: 2012-04-05

Hardware buttons activated based on focus

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Sanjiv Sirpal, Paul Edward Reeves
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second output) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081304A1
CLAIM 1
. A method for presenting control buttons on a device , comprising : providing a first output on a first screen of a device ;
providing a second output (second portion) on a second screen of the device ;
determining which one of the first and second screens contains information that currently has focus ;
displaying at least a first control button in association with the screen determined to contain information that currently has focus .

US20120081304A1
CLAIM 6
. The method of claim 4 , wherein the second output on the second screen includes a display of information including at least a first item of information , wherein user input (user input) is received selecting that at least a first item of information , wherein the focus is shifted from the first screen to the second screen , and wherein in response to the shift in focus the first control button is presented by the second screen .
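
US20120081304A1 claims 1 and 6 place the first control button on whichever screen currently has focus and move it when a selection shifts focus. A minimal sketch follows; FocusButtonManager and the screen labels are invented names used only to illustrate the mapping.

class FocusButtonManager:
    """Hypothetical manager that keeps the first control button on the focused screen."""

    def __init__(self):
        self.focused_screen = "first"   # screen whose information currently has focus
        self.button_location = "first"  # screen presenting the first control button

    def select_item(self, screen: str) -> None:
        # Receiving user input selecting an item on a screen shifts focus there
        # (claim 6), and the control button is presented by that screen (claim 1).
        self.focused_screen = screen
        self.button_location = screen

mgr = FocusButtonManager()
mgr.select_item("second")
assert mgr.button_location == "second"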

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081304A1
CLAIM 18
. The computer readable medium of claim 17 , the computer readable instructions further comprising : instructions to receive input from a touch screen (electronic device status display panel) area of one of the first and second screens , wherein the touch screen receiving the input is identified as the one of the first and second screens having a current focus .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120081304A1
CLAIM 18
. The computer readable medium of claim 17 , the computer readable instructions further comprising : instructions to receive input from a touch screen (electronic device status display panel) area of one of the first and second screens , wherein the touch screen receiving the input is identified as the one of the first and second screens having a current focus .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second output) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081304A1
CLAIM 1
. A method for presenting control buttons on a device , comprising : providing a first output on a first screen of a device ;
providing a second output (second portion) on a second screen of the device ;
determining which one of the first and second screens contains information that currently has focus ;
displaying at least a first control button in association with the screen determined to contain information that currently has focus .

US20120081304A1
CLAIM 6
. The method of claim 4 , wherein the second output on the second screen includes a display of information including at least a first item of information , wherein user input (user input) is received selecting that at least a first item of information , wherein the focus is shifted from the first screen to the second screen , and wherein in response to the shift in focus the first control button is presented by the second screen .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120081304A1
CLAIM 6
. The method of claim 4 , wherein the second output on the second screen includes a display of information including at least a first item of information , wherein user input (user input) is received selecting that at least a first item of information , wherein the focus is shifted from the first screen to the second screen , and wherein in response to the shift in focus the first control button is presented by the second screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081304A1
CLAIM 6
. The method of claim 4 , wherein the second output on the second screen includes a display of information including at least a first item of information , wherein user input (user input) is received selecting that at least a first item of information , wherein the focus is shifted from the first screen to the second screen , and wherein in response to the shift in focus the first control button is presented by the second screen .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081304A1
CLAIM 6
. The method of claim 4 , wherein the second output on the second screen includes a display of information including at least a first item of information , wherein user input (user input) is received selecting that at least a first item of information , wherein the focus is shifted from the first screen to the second screen , and wherein in response to the shift in focus the first control button is presented by the second screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081305A1

Filed: 2011-08-31     Issued: 2012-04-05

Swipeable key line

(Original Assignee) Imerj LLC     (Current Assignee) Z124

Rodney W. Schrock
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (touch screens) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set (first set) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081305A1
CLAIM 4
. The method of claim 1 , wherein the touch screen display includes first and second touch screen displays , wherein a first portion (first portion) of the first selectable set of virtual keys is presented by the first touch screen display during a first time interval , wherein a second portion (second portion, usage frequency) of the first selectable set of virtual keys is presented by the second touch screen display during the first time interval , wherein a first portion of the second selectable set of virtual keys is presented by the first touch screen display during a second time interval , wherein a second portion of the second selectable set of virtual keys is presented by the second touch screen display during the second time interval , wherein the second time interval is after the first time interval , and wherein at least some portion of the second time interval is after the first input is received from the user .

US20120081305A1
CLAIM 9
. The method of claim 8 , wherein the first selectable set of virtual keys is presented as a row of virtual keys , wherein the second selectable set of virtual keys is presented as a row of virtual keys , wherein the third selectable set of virtual keys is presented as a row of virtual keys , wherein the first input is in a first direction (first mode) along the row of virtual keys of the first selectable set of virtual keys and causes the second selectable set of virtual keys to be presented , wherein the second input is in the first direction along the row of virtual keys of the second selectable set of virtual keys and causes the third selectable set of virtual keys to be presented .

US20120081305A1
CLAIM 16
. A computer readable medium having stored thereon computer-executable instructions , the computer executable instructions causing a processor to execute a method for displaying selectable virtual key sets , the computer executable instructions comprising : instructions to display a first set (first set) of virtual keys on a touch screen ;
instructions to determine whether at least a first touch screen input has been received in an area of the touch screen corresponding to the displayed first selectable set of virtual keys ;
instructions to display a second selectable set of virtual keys on the touch screen in place of the first selectable set of virtual keys in response to the at least a first touch screen input .

US20120081305A1
CLAIM 19
. The computer readable medium of claim 18 , the computer executable instructions further comprising : instructions to display the first selectable set of virtual keys on a touch screen that includes first and second touch screens (display screen) during a first time interval , wherein a first portion of the first selectable set of virtual keys is displayed on the first touch screen and a second portion of the first selectable set of virtual keys is displayed on the second touch screen during the first time interval ;
instructions to display the second selectable set of virtual keys on the touch screen that includes first and second touch screens during a second time interval , wherein a first portion of the second selectable set of virtual keys is displayed on the first touch screen and a second portion of the second selectable set of virtual keys is displayed on the second touch screen during the second time interval , wherein the first time interval is before the first touch screen input is received , and wherein the second time interval is after the first touch screen input is received .
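
US20120081305A1 claims 16 and 19 describe replacing one selectable set of virtual keys with the next in response to a touch on the displayed row, with each set split across two touch screens. A sketch of that cycling and splitting follows; the key-set contents and the SwipeableKeyLine class are hypothetical assumptions.

# Hypothetical selectable sets of virtual keys (contents are assumptions).
KEY_SETS = [list("qwertyuiop"), list("asdfghjkl"), list("zxcvbnm")]

class SwipeableKeyLine:
    """Hypothetical key line that advances to the next set on a swipe over the row."""

    def __init__(self):
        self.current = 0

    def split_across_screens(self):
        # A first portion of the current set on the first touch screen and a second
        # portion on the second touch screen (claim 19).
        keys = KEY_SETS[self.current]
        half = len(keys) // 2
        return keys[:half], keys[half:]

    def on_row_swipe(self) -> None:
        # Touch input received in the area of the displayed set replaces it with the
        # next selectable set (claim 16).
        self.current = (self.current + 1) % len(KEY_SETS)

line = SwipeableKeyLine()
first_screen, second_screen = line.split_across_screens()
line.on_row_swipe()                                   # swipe along the key row
next_first, next_second = line.split_across_screens()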

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US20120081305A1
CLAIM 9
. The method of claim 8 , wherein the first selectable set of virtual keys is presented as a row of virtual keys , wherein the second selectable set of virtual keys is presented as a row of virtual keys , wherein the third selectable set of virtual keys is presented as a row of virtual keys , wherein the first input is in a first direction (first mode) along the row of virtual keys of the first selectable set of virtual keys and causes the second selectable set of virtual keys to be presented , wherein the second input is in the first direction along the row of virtual keys of the second selectable set of virtual keys and causes the third selectable set of virtual keys to be presented .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch screens) .
US20120081305A1
CLAIM 19
. The computer readable medium of claim 18 , the computer executable instructions further comprising : instructions to display the first selectable set of virtual keys on a touch screen that includes first and second touch screens (display screen) during a first time interval , wherein a first portion of the first selectable set of virtual keys is displayed on the first touch screen and a second portion of the first selectable set of virtual keys is displayed on the second touch screen during the first time interval ;
instructions to display the second selectable set of virtual keys on the touch screen that includes first and second touch screens during a second time interval , wherein a first portion of the second selectable set of virtual keys is displayed on the first touch screen and a second portion of the second selectable set of virtual keys is displayed on the second touch screen during the second time interval , wherein the first time interval is before the first touch screen input is received , and wherein the second time interval is after the first touch screen input is received .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch screens) .
US20120081305A1
CLAIM 19
. The computer readable medium of claim 18 , the computer executable instructions further comprising : instructions to display the first selectable set of virtual keys on a touch screen that includes first and second touch screens (display screen) during a first time interval , wherein a first portion of the first selectable set of virtual keys is displayed on the first touch screen and a second portion of the first selectable set of virtual keys is displayed on the second touch screen during the first time interval ;
instructions to display the second selectable set of virtual keys on the touch screen that includes first and second touch screens during a second time interval , wherein a first portion of the second selectable set of virtual keys is displayed on the first touch screen and a second portion of the second selectable set of virtual keys is displayed on the second touch screen during the second time interval , wherein the first time interval is before the first touch screen input is received , and wherein the second time interval is after the first touch screen input is received .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch screens) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120081305A1
CLAIM 19
. The computer readable medium of claim 18 , the computer executable instructions further comprising : instructions to display the first selectable set of virtual keys on a touch screen that includes first and second touch screens (display screen) during a first time interval , wherein a first portion of the first selectable set of virtual keys is displayed on the first touch screen and a second portion of the first selectable set of virtual keys is displayed on the second touch screen during the first time interval ;
instructions to display the second selectable set of virtual keys on the touch screen that includes first and second touch screens during a second time interval , wherein a first portion of the second selectable set of virtual keys is displayed on the first touch screen and a second portion of the second selectable set of virtual keys is displayed on the second touch screen during the second time interval , wherein the first time interval is before the first touch screen input is received , and wherein the second time interval is after the first touch screen input is received .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (touch screens) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081305A1
CLAIM 4
. The method of claim 1 , wherein the touch screen display includes first and second touch screen displays , wherein a first portion (first portion) of the first selectable set of virtual keys is presented by the first touch screen display during a first time interval , wherein a second portion (second portion, usage frequency) of the first selectable set of virtual keys is presented by the second touch screen display during the first time interval , wherein a first portion of the second selectable set of virtual keys is presented by the first touch screen display during a second time interval , wherein a second portion of the second selectable set of virtual keys is presented by the second touch screen display during the second time interval , wherein the second time interval is after the first time interval , and wherein at least some portion of the second time interval is after the first input is received from the user .

US20120081305A1
CLAIM 9
. The method of claim 8 , wherein the first selectable set of virtual keys is presented as a row of virtual keys , wherein the second selectable set of virtual keys is presented as a row of virtual keys , wherein the third selectable set of virtual keys is presented as a row of virtual keys , wherein the first input is in a first direction (first mode) along the row of virtual keys of the first selectable set of virtual keys and causes the second selectable set of virtual keys to be presented , wherein the second input is in the first direction along the row of virtual keys of the second selectable set of virtual keys and causes the third selectable set of virtual keys to be presented .

US20120081305A1
CLAIM 19
. The computer readable medium of claim 18 , the computer executable instructions further comprising : instructions to display the first selectable set of virtual keys on a touch screen that includes first and second touch screens (display screen) during a first time interval , wherein a first portion of the first selectable set of virtual keys is displayed on the first touch screen and a second portion of the first selectable set of virtual keys is displayed on the second touch screen during the first time interval ;
instructions to display the second selectable set of virtual keys on the touch screen that includes first and second touch screens during a second time interval , wherein a first portion of the second selectable set of virtual keys is displayed on the first touch screen and a second portion of the second selectable set of virtual keys is displayed on the second touch screen during the second time interval , wherein the first time interval is before the first touch screen input is received , and wherein the second time interval is after the first touch screen input is received .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch screens) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120081305A1
CLAIM 19
. The computer readable medium of claim 18 , the computer executable instructions further comprising : instructions to display the first selectable set of virtual keys on a touch screen that includes first and second touch screens (display screen) during a first time interval , wherein a first portion of the first selectable set of virtual keys is displayed on the first touch screen and a second portion of the first selectable set of virtual keys is displayed on the second touch screen during the first time interval ;
instructions to display the second selectable set of virtual keys on the touch screen that includes first and second touch screens during a second time interval , wherein a first portion of the second selectable set of virtual keys is displayed on the first touch screen and a second portion of the second selectable set of virtual keys is displayed on the second touch screen during the second time interval , wherein the first time interval is before the first touch screen input is received , and wherein the second time interval is after the first touch screen input is received .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081305A1
CLAIM 4
. The method of claim 1 , wherein the touch screen display includes first and second touch screen displays , wherein a first portion of the first selectable set of virtual keys is presented by the first touch screen display during a first time interval , wherein a second portion (second portion, usage frequency) of the first selectable set of virtual keys is presented by the second touch screen display during the first time interval , wherein a first portion of the second selectable set of virtual keys is presented by the first touch screen display during a second time interval , wherein a second portion of the second selectable set of virtual keys is presented by the second touch screen display during the second time interval , wherein the second time interval is after the first time interval , and wherein at least some portion of the second time interval is after the first input is received from the user .
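
As a reading aid for the claim 17 limitations charted above, the registration of an unintentional-grip polygon and its usage frequency can be pictured with the minimal sketch below. The point list, the frequency threshold, and the convex-hull simplification are assumptions made for illustration; they are not taken from the '663 patent or from US20120081305A1.

    # Illustrative sketch: derive a polygonal "holding pattern" from grip touches.
    def convex_hull(points):
        """Monotone-chain convex hull; returns polygon vertices in CCW order."""
        pts = sorted(set(points))
        if len(pts) <= 2:
            return pts
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        lower, upper = [], []
        for p in pts:
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]

    class HoldingPatternRecorder:
        """Hypothetical recorder that registers a personalized virtual bezel polygon."""
        def __init__(self, frequency_threshold=5):
            self.grip_points = []            # unintentional touch coordinates
            self.access_count = 0            # how often the polygonal area is accessed
            self.frequency_threshold = frequency_threshold
            self.virtual_bezel = None        # registered polygon, if any

        def record_unintentional_touch(self, x, y):
            self.grip_points.append((x, y))
            self.access_count += 1
            if self.access_count >= self.frequency_threshold:
                # register the polygonal area as the personalized bezel region
                self.virtual_bezel = convex_hull(self.grip_points)

    # Example: repeated edge grips along the left edge of a 1080x1920 screen.
    rec = HoldingPatternRecorder()
    for x, y in [(40, 300), (55, 600), (35, 900), (60, 1200), (45, 1500)]:
        rec.record_unintentional_touch(x, y)
    print(rec.virtual_bezel)  # polygon vertices once the frequency threshold is reached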

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120081305A1
CLAIM 4
. The method of claim 1 , wherein the touch screen display includes first and second touch screen displays , wherein a first portion of the first selectable set of virtual keys is presented by the first touch screen display during a first time interval , wherein a second portion (second portion, usage frequency) of the first selectable set of virtual keys is presented by the second touch screen display during the first time interval , wherein a first portion of the second selectable set of virtual keys is presented by the first touch screen display during a second time interval , wherein a second portion of the second selectable set of virtual keys is presented by the second touch screen display during the second time interval , wherein the second time interval is after the first time interval , and wherein at least some portion of the second time interval is after the first input is received from the user .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20130021289A1

Filed: 2011-07-19     Issued: 2013-01-24

Touch sensitive displays

(Original Assignee) Apple Inc     (Current Assignee) Apple Inc

Wei Chen, Steven P. Hotelling, John Z. Zhong, Shih-Chang Chang, Stephen S. Poon
US9645663B2
CLAIM 1
. A display system (control signals) for an electronic device (electronic device) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
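
As a reading aid for the two-region, two-mode arrangement of claim 1 charted above (and mirrored in claim 14), the dispatch logic can be sketched as follows. The bezel width, the intentionality heuristic, and the return strings are illustrative assumptions only, not features drawn from the '663 patent or from US20130021289A1.

    # Illustrative sketch: route touches by region and apply a second "mode of response"
    # in the virtual bezel that only passes deliberate gestures through to the content.
    BEZEL_WIDTH = 48  # px, assumed width of the virtual bezel band

    def in_virtual_bezel(x, y, screen_w, screen_h, bezel=BEZEL_WIDTH):
        """True if the touch lands in the edge band adjacent to the active region."""
        return x < bezel or x > screen_w - bezel or y < bezel or y > screen_h - bezel

    def looks_intentional(touch):
        """Assumed heuristic: long swipes in the bezel are treated as deliberate."""
        return touch.get("travel_px", 0) > 30

    def dispatch(touch, screen_w=1080, screen_h=1920):
        x, y = touch["x"], touch["y"]
        if not in_virtual_bezel(x, y, screen_w, screen_h):
            return "first-mode: deliver to active-region content"
        if looks_intentional(touch):
            # second mode: selectively interpret as input affecting the active region
            return "second-mode: apply bezel gesture to active-region content"
        return "second-mode: ignore as grip/unintentional contact"

    print(dispatch({"x": 10, "y": 500, "travel_px": 80}))   # bezel swipe -> passed through
    print(dispatch({"x": 10, "y": 500, "travel_px": 2}))    # resting thumb -> ignored
    print(dispatch({"x": 540, "y": 960, "travel_px": 2}))   # active-region tap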

US9645663B2
CLAIM 2
. The display system (control signals) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 3
. The display system (control signals) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 4
. The display system (control signals) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .
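
Claims 3 and 4 charted above assign a stroke that crosses the region boundary to the region in which it originated. A minimal sketch of that origin-based routing rule, with the region test and stroke representation assumed for illustration:

    # Illustrative sketch: a stroke crossing the active/bezel boundary is processed
    # according to the region where it originated (claims 3 and 4).
    def region_of(point, screen_w=1080, screen_h=1920, bezel=48):
        x, y = point
        edge = x < bezel or x > screen_w - bezel or y < bezel or y > screen_h - bezel
        return "virtual_bezel" if edge else "active_region"

    def route_stroke(stroke):
        """stroke: list of (x, y) samples from touch-down to touch-up."""
        origin = region_of(stroke[0])  # only the first sample decides routing
        return f"process entire stroke as {origin} input"

    print(route_stroke([(540, 960), (1070, 960)]))  # starts active, ends in bezel
    print(route_stroke([(10, 960), (540, 960)]))    # starts in bezel, ends active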

US9645663B2
CLAIM 5
. The display system (control signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 6
. The display system (control signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 7
. The display system (control signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
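
Claims 5 through 7 charted above cover the three possible dispositions of a multi-touch input that begins simultaneously in both regions: treat it as bezel input, treat it as active-region input, or follow a user-set instruction. A minimal sketch, with the preference flag assumed for illustration:

    # Illustrative sketch: resolve a multi-touch that begins simultaneously in both regions.
    def resolve_multitouch(touch_regions, user_preference="active_region"):
        """touch_regions: set of regions where the simultaneous contacts began.
        user_preference: 'active_region' or 'virtual_bezel' (assumed setting, claim 7)."""
        if touch_regions == {"active_region", "virtual_bezel"}:
            return user_preference            # claim 5 or 6 behaviour, selected per claim 7
        (only_region,) = touch_regions        # all contacts began in the same region
        return only_region

    print(resolve_multitouch({"active_region", "virtual_bezel"}, "virtual_bezel"))
    print(resolve_multitouch({"active_region"}))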

US9645663B2
CLAIM 8
. The display system (control signals) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .
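
The status-bar toggle of claim 8 charted above reduces to matching a predefined bezel gesture against a visibility flag. A minimal sketch, with the gesture names assumed for illustration:

    # Illustrative sketch: a predefined bezel gesture toggles the OS status bar / full-screen mode.
    class StatusBarController:
        TOGGLE_GESTURES = {"two_finger_swipe_down", "edge_double_tap"}  # assumed set

        def __init__(self):
            self.status_bar_visible = True

        def on_bezel_gesture(self, gesture_name):
            if gesture_name in self.TOGGLE_GESTURES:
                self.status_bar_visible = not self.status_bar_visible
            return "status bar visible" if self.status_bar_visible else "full-screen mode"

    ctrl = StatusBarController()
    print(ctrl.on_bezel_gesture("two_finger_swipe_down"))  # -> full-screen mode
    print(ctrl.on_bezel_gesture("edge_double_tap"))        # -> status bar visible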

US9645663B2
CLAIM 9
. The display system (control signals) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 10
. The display system (control signals) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US9645663B2
CLAIM 11
. The display system (control signals) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .
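
Claims 9 through 11 charted above recite a small amount of per-button state in the virtual bezel: position, visibility, and the ability to add new buttons. A minimal sketch, with the button names and coordinates assumed for illustration:

    # Illustrative sketch: user-configurable soft buttons confined to the virtual bezel.
    class BezelSoftButtons:
        def __init__(self):
            # pre-defined set (claim 9); positions are illustrative bezel coordinates
            self.buttons = {"back": {"pos": (24, 1800), "visible": True},
                            "home": {"pos": (24, 960), "visible": True}}

        def reposition(self, name, new_pos):      # claim 9: move within the bezel
            self.buttons[name]["pos"] = new_pos

        def toggle_visibility(self, name):        # claim 10: visible <-> hidden
            self.buttons[name]["visible"] = not self.buttons[name]["visible"]

        def add_button(self, name, pos):          # claim 11: user-added button
            self.buttons[name] = {"pos": pos, "visible": True}

    b = BezelSoftButtons()
    b.reposition("back", (24, 1600))
    b.toggle_visibility("home")
    b.add_button("screenshot", (1056, 960))
    print(b.buttons)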

US9645663B2
CLAIM 12
. The display system (control signals) according to claim 9 , wherein the display screen comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20130021289A1
CLAIM 7
. The display defined in claim 6 further comprising : control lines that are coupled to the organic light-emitting diodes ;
and display and touch sensor control circuitry that is configured to generate control signals (display system, electronic device status display panel) that are conveyed over the control lines to the light-emitting diodes and that is configured to gather touch sensor capacitance signals from the patterned transparent conductive structures on the thin-film encapsulation layer and from at least some of the control lines .

US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
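
Claims 12 and 13 charted above add a device-status display panel that can be hidden and a third set of touch-based inputs that keeps navigation available while both the panel and the soft buttons are hidden. A minimal sketch, with the gesture-to-action mapping assumed for illustration:

    # Illustrative sketch: status panel show/hide plus a fallback navigation gesture set.
    class BezelNavigation:
        NAV_GESTURES = {"edge_swipe_up": "home", "edge_swipe_right": "back"}  # assumed

        def __init__(self):
            self.status_panel_visible = True   # battery / signal / time items
            self.soft_buttons_visible = True

        def toggle_status_panel(self):         # claim 12
            self.status_panel_visible = not self.status_panel_visible

        def handle_input(self, gesture):       # claim 13: third set of touch-based inputs
            if not self.status_panel_visible and not self.soft_buttons_visible:
                return self.NAV_GESTURES.get(gesture, "ignored")
            return "handled by visible controls"

    nav = BezelNavigation()
    nav.toggle_status_panel()
    nav.soft_buttons_visible = False
    print(nav.handle_input("edge_swipe_up"))   # -> home, even with everything hidden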

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .
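
Claim 16 charted above describes a short calibration loop: detect the grip region, register it as the virtual bezel, and store the user's chosen response type for that region. A minimal sketch, with the response options assumed for illustration:

    # Illustrative sketch: register a grip region and a personalized response for it.
    class BezelCalibrator:
        RESPONSE_OPTIONS = ("ignore", "scroll_content", "show_soft_buttons")  # assumed

        def __init__(self):
            self.registry = {}  # region -> personalized behaviour

        def detect_grip_region(self, contact_points):
            """Register the detected finger-contact area as the virtual bezel region."""
            region = tuple(sorted(contact_points))
            self.registry[region] = None
            return region

        def store_user_choice(self, region, response):
            if response not in self.RESPONSE_OPTIONS:
                raise ValueError("unknown response type")
            self.registry[region] = response   # personalized behaviour for that region

    cal = BezelCalibrator()
    r = cal.detect_grip_region([(12, 400), (14, 700), (15, 1000)])
    cal.store_user_choice(r, "scroll_content")
    print(cal.registry)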

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (front surface) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface (holding pattern) of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (front surface) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20130021289A1
CLAIM 24
. An electronic device (electronic device) , comprising : a housing ;
components in the housing ;
and a display mounted to a front surface (holding pattern) of the housing , wherein the display has an active area and an inactive peripheral area , and wherein at least one edge of the display is bent along a bend axis that lies within the active area so that a bent edge portion of the display that includes part of the active area and part of the inactive area is located on a sidewall of the housing .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN102713822A

Filed: 2011-06-10     Issued: 2012-10-03

Information input device, information input method, and program (信息输入装置、信息输入方法以及程序)

(Original Assignee) Panasonic Corp     (Current Assignee) Panasonic Corp

池田洋一, 山内真树, 小岛良宏, 高桥知成, 原田久美
US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items (的位置) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
CN102713822A
CLAIM 2
. The information input device according to claim 1 , wherein the touch operation recognizing unit outputs , as a recognition result , the type of the touch operation performed by the operator and an operation amount representing the magnitude of the touch operation , and the recognized type of the touch operation is one of the following operations : a pointing operation for inputting a position (的位置; information items) specified by the operator , and gesture operations of a plurality of types instructing execution of predetermined specific processing .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (第一触) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN102713822A
CLAIM 7
. The information input device according to claim 5 , wherein the touch operation recognition condition storage unit stores the plurality of touch operation recognition conditions in association with each of a plurality of subregions defined on the operation surface of the touch sensor , and the touch operation determining unit determines whether the touch feature amount satisfies a touch operation recognition condition for recognizing a first touch (第一触; touchscreen area) operation among the types of the plurality of touch operations , the touch operation recognition condition being one of the plurality of touch operation recognition conditions stored in association with the subregion , among the plurality of subregions , that contains the touch operation start position , and , when it is determined that the condition is not satisfied , determines that the type of the touch operation is not the first touch operation .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120293456A1

Filed: 2011-06-10     Issued: 2012-11-22

Information input apparatus, information input method, and program

(Original Assignee) Panasonic Corp     (Current Assignee) Panasonic Intellectual Property Corp of America

Yoichi Ikeda, Masaki Yamauchi, Yoshihiro Kojima, Tomonari Takahashi, Kumi Harada
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first contact) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (first contact) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120293456A1
CLAIM 24
. The information input apparatus according to claim 21 , wherein the touch operation start position detecting unit is configured to detect , as the touch operation start position , a touch position at which the operator first contact (first set, second set, holding pattern) s the touch sensor with the finger .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (operation time) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120293456A1
CLAIM 25
. The information input apparatus according to claim 22 , further comprising a touch information series storage unit configured to store the touch information for a predetermined certain period of time , wherein the touch operation recognizing unit includes : a touch feature amount calculation unit configured to calculate , as a touch feature amount , at least one of a touch operation time (operating system status bar) , a touch movement distance , a touch movement speed , a touch movement acceleration , and a touch movement direction , using the touch information stored in the touch information series storage unit ;
and a touch operation determining unit configured to determine the type of touch operation using the touch feature amount , based on the touch operation recognition conditions stored in the touch operation recognition condition storage unit in association with each of the touch operation start positions .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items (operation type) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120293456A1
CLAIM 35
. The information input apparatus according to claim 34 , further comprising a touch operation type (information items) frequency storage unit configured to store a type frequency that is a frequency for each of the types of touch operation recognized by the touch operation recognizing unit , in association with each of the subregions , wherein the recognition priority level determining unit is configured to determine the recognition priority level so that a touch operation of a type having a higher type frequency has a higher recognition priority level .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first contact) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120293456A1
CLAIM 24
. The information input apparatus according to claim 21 , wherein the touch operation start position detecting unit is configured to detect , as the touch operation start position , a touch position at which the operator first contact (first set, second set, holding pattern) s the touch sensor with the finger .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (threshold value) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first contact) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120293456A1
CLAIM 24
. The information input apparatus according to claim 21 , wherein the touch operation start position detecting unit is configured to detect , as the touch operation start position , a touch position at which the operator first contact (first set, second set, holding pattern) s the touch sensor with the finger .

US20120293456A1
CLAIM 34
. The information input apparatus according to claim 27 , further comprising : a recognition priority level determining unit configured to determine a recognition priority level indicating a degree of recognition for each of the types of touch operation , in association with each of the subregions ;
and a touch operation recognition condition update unit configured to update , according to the recognition priority level determined in association with , among the subregions , a third subregion , the touch operation recognition conditions stored in association with the third subregion , wherein the touch operation recognition condition update unit is configured to update a threshold value (s hand) included in the touch operation recognition conditions so that a type of touch operation having a higher recognition priority level is recognized more easily .
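
To ease reading of the US20120293456A1 passages cited above against the 'usage frequency' limitations, the reference's per-subregion, priority-driven threshold update (claims 34 and 35) can be sketched as follows. The numeric values and the update rule are assumptions for illustration only, not a restatement of the reference's actual algorithm:

    # Illustrative sketch: per-subregion type frequencies drive a recognition-priority
    # update that makes the most frequent touch-operation type easier to recognize.
    from collections import Counter

    class SubregionRecognizer:
        """Hypothetical per-subregion recognizer with priority-driven thresholds."""
        def __init__(self, base_threshold=30.0, step=5.0):
            self.base_threshold = base_threshold
            self.step = step
            self.type_frequency = Counter()   # claim 35: frequency per touch-operation type
            self.thresholds = {}              # per-type recognition condition (pixels, assumed)

        def observe(self, op_type):
            self.type_frequency[op_type] += 1
            self._update_thresholds()

        def _update_thresholds(self):
            # claim 34: more frequent types get a higher recognition priority, modelled
            # here as a lower movement threshold (recognized more easily).
            for rank, (op_type, _) in enumerate(self.type_frequency.most_common()):
                self.thresholds[op_type] = self.base_threshold + self.step * rank

    r = SubregionRecognizer()
    for op in ("flick", "flick", "pinch", "flick"):
        r.observe(op)
    print(r.thresholds)   # {'flick': 30.0, 'pinch': 35.0}: 'flick' is easiest to trigger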




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120154294A1

Filed: 2010-12-17     Issued: 2012-06-21

Using movement of a computing device to enhance interpretation of input events produced when interacting with the computing device

(Original Assignee) Microsoft Corp     (Current Assignee) Microsoft Technology Licensing LLC

Kenneth P. Hinckley, Michel Pahud, Wenqi Shen
US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120154294A1
CLAIM 4
. The computing device of claim 1 , wherein said at least one movement-type input mechanism comprises one (operating system status bar) or more of an accelerometer and a gyro device .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (light sensor) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120154294A1
CLAIM 8
. The computing device of claim 6 , wherein the IBSM is configured to also conclude that said at least part of the input action is unintentional based on an output of a light sensor (thermal sensors, receiving touch, s thermal sensors) .
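
Claim 18 charted above derives the polygonal area from a heat signature rather than from touch events. As a rough reading aid only, a coarse thermal-sensor grid can be reduced to polygon vertices as sketched below; the grid size, the temperature threshold, and the bounding-box simplification are all assumptions, not taken from the '663 patent or from US20120154294A1.

    # Illustrative sketch: reduce a coarse thermal-sensor grid to a polygonal grip area.
    HEAT_THRESHOLD = 31.0  # degrees C, assumed cutoff for "hand-warmed" cells

    def grip_polygon(thermal_grid, cell_px=120):
        """Return the vertices of the axis-aligned polygon covering hot cells."""
        hot = [(col, row)
               for row, line in enumerate(thermal_grid)
               for col, temp in enumerate(line)
               if temp >= HEAT_THRESHOLD]
        if not hot:
            return []
        xs = [c for c, _ in hot]
        ys = [r for _, r in hot]
        x0, x1 = min(xs) * cell_px, (max(xs) + 1) * cell_px
        y0, y1 = min(ys) * cell_px, (max(ys) + 1) * cell_px
        return [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]  # polygon vertices to register

    # A 4x3 grid where the left column is warmed by the holding hand.
    grid = [[33.0, 24.0, 24.5],
            [34.1, 25.0, 24.0],
            [33.5, 24.2, 24.8],
            [32.9, 24.0, 24.1]]
    print(grip_polygon(grid))  # [(0, 0), (120, 0), (120, 480), (0, 480)]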




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084697A1

Filed: 2010-11-17     Issued: 2012-04-05

User interface with independent drawer control

(Original Assignee) Flextronics Innovative Development LLC     (Current Assignee) Z124

Paul E. Reeves
US9645663B2
CLAIM 1
. A display system (first location) for an electronic device comprising : a touch-sensitive display screen (status bar) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 2
. The display system (first location) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 3
. The display system (first location) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 4
. The display system (first location) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 5
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (status bar) .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 6
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (status bar) .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 7
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 8
. The display system (first location) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 9
. The display system (first location) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 10
. The display system (first location) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 11
. The display system (first location) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US9645663B2
CLAIM 12
. The display system (first location) according to claim 9 , wherein the display screen (status bar) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084697A1
CLAIM 3
. The method according to claim 2 , wherein when the gesture input is received at a first location (display system) , the first drawer is opened , and wherein when the gesture input is received at a second location , the second drawer is opened , and wherein when the gesture is received at a third location , both the first drawer and the second drawers are opened .

US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (status bar) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (status bar) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (status bar) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (status bar) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .
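Claim 17 builds the bezel from data: repeated unintentional touches define a polygonal area, its access frequency is tracked, and the two together yield a personalized holding pattern. The sketch below shows one way those steps could fit together; the 10-pixel rounding grid and the count threshold are assumptions for illustration only.

from collections import Counter

class HoldingPatternLearner:
    """Accumulates unintentional-touch polygons with their usage frequency and
    registers the most frequent one as the user's virtual bezel region."""

    def __init__(self, device_memory: dict):
        self.memory = device_memory
        self.freq = Counter()   # detected usage frequency per polygon
        self.polygons = {}

    @staticmethod
    def _key(vertices):
        # Round to a 10-pixel grid so repeated grips in the same spot match.
        return tuple((round(x, -1), round(y, -1)) for x, y in vertices)

    def record_unintentional_touch(self, vertices):
        key = self._key(vertices)
        self.freq[key] += 1
        self.polygons[key] = list(vertices)

    def register_holding_pattern(self, min_count: int = 5):
        if not self.freq:
            return None
        key, count = self.freq.most_common(1)[0]
        if count >= min_count:
            # The polygon plus its frequency define the personalized pattern.
            self.memory["virtual_bezel_region"] = self.polygons[key]
        return self.memory.get("virtual_bezel_region")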

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (status bar) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084697A1
CLAIM 9
. The method according to claim 8 , wherein the graphical portion comprises at least a portion of a status bar (display screen, screen mode, touchscreen display) .
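Claim 18 replaces touch detection with thermal sensing: a heat signature from the holding hand defines the polygonal area. The sketch below thresholds a grid of temperature readings near skin temperature and returns a bounding rectangle as a stand-in for the polygon's vertices; the sensor layout, the 32 °C threshold, and the rectangle simplification are all assumptions, not details from the patent.

def heat_signature_polygon(thermal_frame, threshold_c: float = 32.0):
    """thermal_frame: 2-D list of temperatures (deg C) from the device's
    thermal sensors, indexed as thermal_frame[row][col]. Cells at or above
    threshold_c are treated as covered by the user's hand."""
    warm = [(r, c)
            for r, row in enumerate(thermal_frame)
            for c, temp in enumerate(row)
            if temp >= threshold_c]
    if not warm:
        return []
    rows = [r for r, _ in warm]
    cols = [c for _, c in warm]
    # Axis-aligned rectangle standing in for the polygonal area's vertices.
    return [(min(rows), min(cols)), (min(rows), max(cols)),
            (max(rows), max(cols)), (max(rows), min(cols))]

frame = [[24, 24, 24],
         [24, 33, 34],   # warm patch where the hand rests
         [24, 35, 36]]
print(heat_signature_polygon(frame))  # [(1, 1), (1, 2), (2, 2), (2, 1)]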




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084735A1

Filed: 2010-11-17     Issued: 2012-04-05

Gesture controls for multi-screen user interface

(Original Assignee) Flextronics Innovative Development LLC     (Current Assignee) Z124

Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (external display) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084735A1
CLAIM 12
. The method according to claim 11 , wherein the handheld computing device is in operative communication with an external display (touchscreen layer, touchscreen area) , and the method further comprises relocating the screen from the first display to the external display .
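Claim 1 differs from claim 14 mainly in reciting a stored gestural software application that produces the second mode of response, i.e. the logic that decides whether a bezel contact is intentional. A minimal sketch of one way such selective interpretation could work, distinguishing a deliberate swipe from a resting grip; the 30-pixel and 0.8-second thresholds are illustrative assumptions, not values from the patent or the cited reference.

import math

def is_intentional(samples, min_travel_px=30.0, max_duration_s=0.8):
    """Treat a bezel touch as intentional user input only if it travels far
    enough within a short time; a long, nearly static contact is presumed to
    be the holding hand and is ignored.

    samples: chronological list of (timestamp_s, x, y) tuples for one touch.
    """
    if len(samples) < 2:
        return False
    t0, x0, y0 = samples[0]
    t1, x1, y1 = samples[-1]
    travel = math.hypot(x1 - x0, y1 - y0)
    return travel >= min_travel_px and (t1 - t0) <= max_duration_s

# A quick edge swipe is accepted; a two-second static rest is rejected.
print(is_intentional([(0.00, 5, 500), (0.15, 60, 500)]))  # True
print(is_intentional([(0.00, 5, 500), (2.00, 7, 501)]))   # False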

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (single display) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084735A1
CLAIM 4
. The method according to claim 3 , wherein the screen occupies a single display (operating system status bar) and said modifying includes moving the screen from the first display to the second display .
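Claim 8 places the operating system status bar inside the virtual bezel and lets a predefined gesture toggle full-screen mode. A minimal sketch of that toggle; the gesture name and the state handling are hypothetical.

class StatusBarController:
    """Tracks visibility of a status bar that resides in the virtual bezel."""

    TOGGLE_GESTURES = {"two-finger-swipe-down"}  # predefined set (assumed)

    def __init__(self):
        self.visible = True  # status bar shown; False means full-screen mode

    def on_bezel_gesture(self, gesture: str) -> bool:
        if gesture in self.TOGGLE_GESTURES:
            self.visible = not self.visible
        return self.visible

bar = StatusBarController()
bar.on_bezel_gesture("two-finger-swipe-down")         # hide: full-screen mode
print(bar.on_bezel_gesture("two-finger-swipe-down"))  # True: shown again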

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120084735A1
CLAIM 16
. The method according to claim 14 , wherein the touch sensitive device is a touch screen (electronic device status display panel) display including at least one of the first or second displays .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120084735A1
CLAIM 16
. The method according to claim 14 , wherein the touch sensitive device is a touch screen (electronic device status display panel) display including at least one of the first or second displays .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (external display) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084735A1
CLAIM 12
. The method according to claim 11 , wherein the handheld computing device is in operative communication with an external display (touchscreen layer, touchscreen area) , and the method further comprises relocating the screen from the first display to the external display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (external display) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084735A1
CLAIM 12
. The method according to claim 11 , wherein the handheld computing device is in operative communication with an external display (touchscreen layer, touchscreen area) , and the method further comprises relocating the screen from the first display to the external display .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084738A1

Filed: 2010-11-17     Issued: 2012-04-05

User interface with stacked application management

(Original Assignee) Flextronics Innovative Development LLC     (Current Assignee) Z124

Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084738A1
CLAIM 15
. A method of controlling a handheld computing device comprising : receiving a first portion (first portion) of a gesture input ;
targeting the first portion of the gesture input to a first display having a first actively displayed screen corresponding to a first application ;
maintaining the first actively displayed screen in the first display in response to the first portion of the gesture input ;
receiving a second portion (second portion, usage frequency) of the gesture input ;
targeting a different screen than the actively displayed screen in the first display with the second portion ;
and wherein the different screen undergoes a change in position with respect to the first display and at least a second display in response to the second portion of the gesture input .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084738A1
CLAIM 4
. The method according to claim 3 , wherein the underlying screen comprises one (operating system status bar) of a desktop screen and an application screen corresponding to a second application logically associated with the first display such that the second application belongs to the first application stack .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084738A1
CLAIM 15
. A method of controlling a handheld computing device comprising : receiving a first portion (first portion) of a gesture input ;
targeting the first portion of the gesture input to a first display having a first actively displayed screen corresponding to a first application ;
maintaining the first actively displayed screen in the first display in response to the first portion of the gesture input ;
receiving a second portion (second portion, usage frequency) of the gesture input ;
targeting a different screen than the actively displayed screen in the first display with the second portion ;
and wherein the different screen undergoes a change in position with respect to the first display and at least a second display in response to the second portion of the gesture input .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084738A1
CLAIM 15
. A method of controlling a handheld computing device comprising : receiving a first portion of a gesture input ;
targeting the first portion of the gesture input to a first display having a first actively displayed screen corresponding to a first application ;
maintaining the first actively displayed screen in the first display in response to the first portion of the gesture input ;
receiving a second portion (second portion, usage frequency) of the gesture input ;
targeting a different screen than the actively displayed screen in the first display with the second portion ;
and wherein the different screen undergoes a change in position with respect to the first display and at least a second display in response to the second portion of the gesture input .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084738A1
CLAIM 15
. A method of controlling a handheld computing device comprising : receiving a first portion of a gesture input ;
targeting the first portion of the gesture input to a first display having a first actively displayed screen corresponding to a first application ;
maintaining the first actively displayed screen in the first display in response to the first portion of the gesture input ;
receiving a second portion (second portion, usage frequency) of the gesture input ;
targeting a different screen than the actively displayed screen in the first display with the second portion ;
and wherein the different screen undergoes a change in position with respect to the first display and at least a second display in response to the second portion of the gesture input .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120084736A1

Filed: 2010-11-17     Issued: 2012-04-05

Gesture controlled screen repositioning for one or more displays

(Original Assignee) Flextronics Innovative Development LLC     (Current Assignee) Z124

Sanjiv Sirpal
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120084736A1
CLAIM 2
. The method as recited in claim 1 , wherein the first gesture input is a drag gesture in a first direction (first mode) .

US20120084736A1
CLAIM 12
. The method as recited in claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

US20120084736A1
CLAIM 14
. The method as recited in claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion (first portion) of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US20120084736A1
CLAIM 2
. The method as recited in claim 1 , wherein the first gesture input is a drag gesture in a first direction (first mode) .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (single display) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120084736A1
CLAIM 14
. The method as recited in claim 1 , wherein the plurality of displays comprise separate portions of a single display (operating system status bar) , wherein the first display corresponds with a first portion of the single display and the second display corresponds with a second portion of the single display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (first touch screen) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120084736A1
CLAIM 2
. The method as recited in claim 1 , wherein the first gesture input is a drag gesture in a first direction (first mode) .

US20120084736A1
CLAIM 12
. The method as recited in claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

US20120084736A1
CLAIM 14
. The method as recited in claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion (first portion) of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (first touch screen) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120084736A1
CLAIM 12
. The method as recited in claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (first touch screen) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084736A1
CLAIM 12
. The method as recited in claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

US20120084736A1
CLAIM 14
. The method as recited in claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (first touch screen) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120084736A1
CLAIM 12
. The method as recited in claim 11 , wherein said first touch sensitive portion is associated with the first display to comprise a first touch screen (user input) display , and wherein said second touch sensitive portion is associated with a second touch sensitive display to comprise a second touch sensitive display .

US20120084736A1
CLAIM 14
. The method as recited in claim 1 , wherein the plurality of displays comprise separate portions of a single display , wherein the first display corresponds with a first portion of the single display and the second display corresponds with a second portion (second portion, usage frequency) of the single display .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120081277A1

Filed: 2010-11-17     Issued: 2012-04-05

Multi-screen user interface with orientation based control

(Original Assignee) Flextronics Innovative Development LLC     (Current Assignee) Z124

Alex de Paz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set (different one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120081277A1
CLAIM 1
. A method of controlling a plurality of displays of a handheld computing device , comprising : disposing the handheld computing device in a first orientation ;
displaying a first screen of a first application on a first display of the plurality of displays when the handheld computing device is in the first orientation ;
positioning the handheld computing device in a second orientation different than the first orientation by moving the handheld computing device from the first orientation in a first direction (first mode) ;
and modifying the plurality of displays such that the first application is displayed on the first display and a second display of the plurality of displays in response to the positioning .

US20120081277A1
CLAIM 12
. The method according to claim 7 , further comprising : receiving a gesture input at the handheld computing device when the handheld computing device is in one of the second orientation and the third orientation ;
altering one of the first display and second display to display a different one (first set) of the first application and the second application in response to the receiving .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US20120081277A1
CLAIM 1
. A method of controlling a plurality of displays of a handheld computing device , comprising : disposing the handheld computing device in a first orientation ;
displaying a first screen of a first application on a first display of the plurality of displays when the handheld computing device is in the first orientation ;
positioning the handheld computing device in a second orientation different than the first orientation by moving the handheld computing device from the first orientation in a first direction (first mode) ;
and modifying the plurality of displays such that the first application is displayed on the first display and a second display of the plurality of displays in response to the positioning .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120081277A1
CLAIM 1
. A method of controlling a plurality of displays of a handheld computing device , comprising : disposing the handheld computing device in a first orientation ;
displaying a first screen of a first application on a first display of the plurality of displays when the handheld computing device is in the first orientation ;
positioning the handheld computing device in a second orientation different than the first orientation by moving the handheld computing device from the first orientation in a first direction (first mode) ;
and modifying the plurality of displays such that the first application is displayed on the first display and a second display of the plurality of displays in response to the positioning .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN101996043A

Filed: 2010-08-24     Issued: 2011-03-30

执行移动终端的热键功能的装置和方法 (Apparatus and method for executing a hot-key function of a mobile terminal)

(Original Assignee) FANTAI Co Ltd     (Current Assignee) Pan Thai Co.,Ltd.

金龙植, 金知炯, 韩尚根, 李后东, 洪在万, 尹智熙
US9645663B2
CLAIM 1
. A display system for an electronic device (功能的方法) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (功能的方法) for the gestural hardware on how a multi-touch input will be processed .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (功能的方法) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 13
. The electronic device (功能的方法) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function (选择功能) to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
CN101996043A
CLAIM 5
. The apparatus according to claim 1 , wherein the hot-key function is a menu selection function (virtual bezel region function)

CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 14
. An electronic device (功能的方法) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 15
. The electronic device (功能的方法) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (功能的方法) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (功能的方法) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (功能的方法) having a touchscreen display , the method comprising : receiving a heat signature (第三热) from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN101996043A
CLAIM 13
. The apparatus according to claim 2 , wherein the direction of motion is one of up , down , left , and right , and wherein the controller executes a first hot-key function , a second hot-key function , a third hot-key (heat signature) function , or a fourth hot-key function according to the detected upward , downward , leftward , or rightward motion direction of the mobile terminal , respectively .

CN101996043A
CLAIM 16
. A method of executing a hot-key function of a mobile terminal (electronic device, handheld interactive electronic device) , the method comprising the steps of : detecting a touch ; detecting motion of the mobile terminal ; and , if a touch is detected and motion of the mobile terminal is detected , executing a first hot-key function .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN102667662A

Filed: 2010-07-07     Issued: 2012-09-12

柔性显示器的交互技术 (Interaction techniques for flexible displays)

(Original Assignee) 罗尔·弗特加尔; 贾斯廷·利; 伊夫斯·比哈尔; 皮查亚·帕通古尔     

罗尔·弗特加尔, 贾斯廷·利, 伊夫斯·比哈尔, 皮查亚·帕通古尔
US9645663B2
CLAIM 1
. A display system for an electronic device (包含下列步骤) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (数字上) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading (second mode) the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (包含下列步骤) for the gestural hardware on how a multi-touch input will be processed .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (包含下列步骤) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 13
. The electronic device (包含下列步骤) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 14
. An electronic device (包含下列步骤) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (数字上) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading (second mode) the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 15
. The electronic device (包含下列步骤) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (数字上) of response in the virtual bezel region .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading (second mode) the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (包含下列步骤) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (包含下列步骤) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (的使用) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN102667662A
CLAIM 8
. The method according to claim 6 , wherein the input to the computer system causes a command to be executed on the computer system , and wherein the command is selected from the group consisting of : a . activation , in which the software and display of the computer system wake from sleep , a screen saver or reduced-energy state is deactivated , or an advertising campaign is started ; b . deactivation , in which the software and display of the computer go to sleep , a screen saver or reduced-energy state is started , or advertising activity is disabled ; c . zoom in or magnify , in which a document , or an image or content of a document , presented on the display is magnified or brought closer ; d . zoom out or shrink , in which a file or document , or an image or content thereof , presented on the display is reduced or moved away ; e . organize , in which some property of the file(s) , digital information , text , images , or other computer content associated with or shown on the display surface(s) is digitally organized or sorted in a way that matches a property of the physical computer system , such as physical order ; f . scroll , in which a segment of the image or content of a file , document , or application is presented on the display , the segment not having been presented previously and being spatially contiguous with the segment of the image or content previously presented on the display ; g . page down , in which the segment of file content following the portion of the content currently presented on the display is navigated to , so that the following portion is presented on the display ; h . page up , in which the segment of file content preceding the portion of the content currently presented on the display is navigated to , so that the preceding portion is presented on the display ; i . navigate , in which any portion of file content on the computer system , or some online content , hyperlink , or menu , is navigated to , causing the associated content to be presented on the display ; j . page back or forward , in which a portion of file content before or after the portion currently presented on the display , or some online content , web page , or hyperlink , is navigated to , causing that content to be presented on the display ; open , save , or close , in which a file or digital information on the computer system is opened or closed , read into memory , or written out to a permanent storage medium ; k . move , copy , or paste , in which a portion of file content , an image , text , or other digital information associated with the computer system or display is transferred to another computer system or display , or to a different logical location on the same computer system or display ; select , in which a graphical object presented on the display is selected so that it becomes the recipient of subsequent actions , inputs , or commands to the associated computer system ; l . click , in which an insertion point or cursor is moved to a specific location on the display , and the graphical object under that location on the display is selected or activated ; m . erase , in which selected information or images , or content associated with the images on the computer system , are erased from the display and/or from the memory of the computer system ; n . playback control , in which a multimedia file comprising graphical animation , video , sound , or music content on the computer system is played at a certain speed , the speed optionally being controlled by the input ; o . connect , in which the computer system connects over a computer network to another computer system , an online server , a communication tool , or a social networking site ; p . share , in which information on the computer system is placed on a computer server so that the information is shared with other users connected to the server ; q . online status , in which information about the user's use of the computer system (usage frequency) , or some arbitrary status or attribute of the user , is shared with a computer server so that the information is shared with other users connected to the server ; r . communicate , in which the computer system is used as a communication device ; s . advertise , in which advertisements are presented on the display ; t . order , in which a beverage or food order selected on the display , together with payment for the order , is processed and forwarded to a supplier , vending machine , refill station , or dispenser ; u . gambling and gaming , in which the computer system is used to play games , promotional games of chance , lotteries , and the like ; v . segmented display , in which the computer system displays an image across multiple displays ; and w . authentication , in which the computer system provides access to a specific user or to the use of information on the computer system .

CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (包含下列步骤) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (的使用) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN102667662A
CLAIM 8
. The method according to claim 6 , wherein the input to the computer system causes a command to be executed on the computer system , and wherein the command is selected from the group consisting of : a . activation , in which the software and display of the computer system wake from sleep , a screen saver or reduced-energy state is deactivated , or an advertising campaign is started ; b . deactivation , in which the software and display of the computer go to sleep , a screen saver or reduced-energy state is started , or advertising activity is disabled ; c . zoom in or magnify , in which a document , or an image or content of a document , presented on the display is magnified or brought closer ; d . zoom out or shrink , in which a file or document , or an image or content thereof , presented on the display is reduced or moved away ; e . organize , in which some property of the file(s) , digital information , text , images , or other computer content associated with or shown on the display surface(s) is digitally organized or sorted in a way that matches a property of the physical computer system , such as physical order ; f . scroll , in which a segment of the image or content of a file , document , or application is presented on the display , the segment not having been presented previously and being spatially contiguous with the segment of the image or content previously presented on the display ; g . page down , in which the segment of file content following the portion of the content currently presented on the display is navigated to , so that the following portion is presented on the display ; h . page up , in which the segment of file content preceding the portion of the content currently presented on the display is navigated to , so that the preceding portion is presented on the display ; i . navigate , in which any portion of file content on the computer system , or some online content , hyperlink , or menu , is navigated to , causing the associated content to be presented on the display ; j . page back or forward , in which a portion of file content before or after the portion currently presented on the display , or some online content , web page , or hyperlink , is navigated to , causing that content to be presented on the display ; open , save , or close , in which a file or digital information on the computer system is opened or closed , read into memory , or written out to a permanent storage medium ; k . move , copy , or paste , in which a portion of file content , an image , text , or other digital information associated with the computer system or display is transferred to another computer system or display , or to a different logical location on the same computer system or display ; select , in which a graphical object presented on the display is selected so that it becomes the recipient of subsequent actions , inputs , or commands to the associated computer system ; l . click , in which an insertion point or cursor is moved to a specific location on the display , and the graphical object under that location on the display is selected or activated ; m . erase , in which selected information or images , or content associated with the images on the computer system , are erased from the display and/or from the memory of the computer system ; n . playback control , in which a multimedia file comprising graphical animation , video , sound , or music content on the computer system is played at a certain speed , the speed optionally being controlled by the input ; o . connect , in which the computer system connects over a computer network to another computer system , an online server , a communication tool , or a social networking site ; p . share , in which information on the computer system is placed on a computer server so that the information is shared with other users connected to the server ; q . online status , in which information about the user's use of the computer system (usage frequency) , or some arbitrary status or attribute of the user , is shared with a computer server so that the information is shared with other users connected to the server ; r . communicate , in which the computer system is used as a communication device ; s . advertise , in which advertisements are presented on the display ; t . order , in which a beverage or food order selected on the display , together with payment for the order , is processed and forwarded to a supplier , vending machine , refill station , or dispenser ; u . gambling and gaming , in which the computer system is used to play games , promotional games of chance , lotteries , and the like ; v . segmented display , in which the computer system displays an image across multiple displays ; and w . authentication , in which the computer system provides access to a specific user or to the use of information on the computer system .

CN102667662A
CLAIM 14
. A method of delivering promotional material from a supplier or vending machine to an interactive food or beverage container of a customer , comprising the following steps (electronic device) : a . optionally , identifying the container by the supplier or vending machine by way of the container being within a threshold distance of the supplier or vending machine ; and b . optionally , identifying the container by the customer contacting the supplier or vending machine through a user interface provided on the container ; and c . optionally , identifying the container by the customer placing an order with the supplier or vending machine ; and wherein d . the supplier or vending machine selects the promotional material based on timing , the customer's historical order characteristics , or the customer's order characteristics ; and e . digitally uploading the promotional material to the container over a wireless or wired network ; and f . displaying or playing the promotional material on the container .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
KR20120093148A

Filed: 2010-07-07     Issued: 2012-08-22

플렉시블 디스플레이를 위한 상호작용 기법 (Interaction techniques for flexible displays)

(Original Assignee) 로엘 버티갈; 이브 베하; 저스틴 이; 피차야 푸톤굴     

로엘 버티갈, 이브 베하, 저스틴 이, 피차야 푸톤굴
US9645663B2
CLAIM 1
. A display system (베이스) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
KR20120093148A
CLAIM 1
A reusable portable interactive device characterized by comprising : a . a customizable lid ;
b . an input and output device selection unit ;
c . a container portion having a curved display technology selected from the group consisting of flexible E-ink , flexible organic light-emitting diode , flexible LED array , projection , laser , and paintable displays ;
d . a base (display system) equipped with a computing device selected from the group consisting of a battery , a power connector , a network connector , an audiovisual connector , a central processing unit , a wireless network transceiver , a graphics circuit board , RAM memory , firmware ROM , flash memory , and a hard disk drive .

US9645663B2
CLAIM 2
. The display system (베이스) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
KR20120093148A
CLAIM 1
A reusable portable interactive device characterized by comprising : a . a customizable lid ;
b . an input and output device selection unit ;
c . a container portion having a curved display technology selected from the group consisting of flexible E-ink , flexible organic light-emitting diode , flexible LED array , projection , laser , and paintable displays ;
d . a base (display system) equipped with a computing device selected from the group consisting of a battery , a power connector , a network connector , an audiovisual connector , a central processing unit , a wireless network transceiver , a graphics circuit board , RAM memory , firmware ROM , flash memory , and a hard disk drive .

US9645663B2
CLAIM 3
. The display system (베이스) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .

US9645663B2
CLAIM 4
. The display system (베이스) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .
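
Claims 3 and 4 charted above resolve a touch that crosses the region boundary by the region in which it originated. The hypothetical sketch below illustrates that attribution rule only; region_of, attribute_stroke, and the coordinate values are assumptions introduced for this chart.

# Hypothetical sketch: a touch stroke is processed by the region in which it
# originated, regardless of where it terminates (cf. claims 3 and 4).
def region_of(point, bezel=48, width=1080, height=1920):
    x, y = point
    edge = x < bezel or x >= width - bezel or y < bezel or y >= height - bezel
    return "virtual_bezel" if edge else "active"

def attribute_stroke(start, end):
    # The originating region "owns" the whole stroke; the termination point
    # is intentionally unused.
    return region_of(start)

print(attribute_stroke(start=(540, 900), end=(10, 900)))   # active -> bezel: 'active'
print(attribute_stroke(start=(10, 900), end=(540, 900)))   # bezel -> active: 'virtual_bezel'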

US9645663B2
CLAIM 5
. The display system (베이스) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .

US9645663B2
CLAIM 6
. The display system (베이스) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .

US9645663B2
CLAIM 7
. The display system (베이스) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .
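
Claims 5 through 7 charted above differ only in how a multi-touch gesture that begins simultaneously in both regions is attributed: to the virtual bezel, to the active region, or per an instruction made by the user. A minimal, hypothetical sketch of such a policy switch follows; _region, resolve_multitouch, and the policy names are invented for illustration.

# Hypothetical sketch: contacts starting in both regions at once are resolved
# by a policy (cf. claim 5: bezel wins; claim 6: active wins; claim 7: the
# user's stored instruction decides).
def _region(p, bezel=48, w=1080, h=1920):
    x, y = p
    edge = x < bezel or x >= w - bezel or y < bezel or y >= h - bezel
    return "virtual_bezel" if edge else "active"

def resolve_multitouch(origins, policy="user_instruction", user_setting="active"):
    regions = {_region(p) for p in origins}
    if regions != {"active", "virtual_bezel"}:
        return regions.pop()                      # all contacts in one region
    if policy == "prefer_bezel":                  # cf. claim 5
        return "virtual_bezel"
    if policy == "prefer_active":                 # cf. claim 6
        return "active"
    return user_setting                           # cf. claim 7

print(resolve_multitouch([(10, 500), (540, 960)], policy="prefer_bezel"))
print(resolve_multitouch([(10, 500), (540, 960)], policy="prefer_active"))
print(resolve_multitouch([(10, 500), (540, 960)], user_setting="virtual_bezel"))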

US9645663B2
CLAIM 8
. The display system (베이스) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .
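
Claim 8 charted above places the operating system status bar in the virtual bezel and lets a predefined gesture toggle its visibility for a full-screen mode. The sketch below is a hypothetical illustration under that reading; StatusBar, on_bezel_gesture, and the gesture name two_finger_swipe_down are assumptions, not terms from either patent.

# Hypothetical sketch: a predefined gesture received in the virtual bezel
# toggles the operating system status bar between visible and full-screen.
class StatusBar:
    def __init__(self):
        self.visible = True

    def on_bezel_gesture(self, gesture: str) -> bool:
        # Only the predefined gesture toggles visibility; others are ignored.
        if gesture == "two_finger_swipe_down":
            self.visible = not self.visible
        return self.visible

bar = StatusBar()
print(bar.on_bezel_gesture("tap"))                    # True: unchanged
print(bar.on_bezel_gesture("two_finger_swipe_down"))  # False: full-screen mode
print(bar.on_bezel_gesture("two_finger_swipe_down"))  # True: status bar restored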

US9645663B2
CLAIM 9
. The display system (베이스) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .

US9645663B2
CLAIM 10
. The display system (베이스) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .

US9645663B2
CLAIM 11
. The display system (베이스) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .
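
Claims 9 through 11 charted above concern touch-based soft buttons that reside in the virtual bezel and can be repositioned, hidden or shown, and added by the user. The following hypothetical sketch illustrates those three operations; BezelButtons and its methods, as well as the button names and coordinates, are invented for this chart.

# Hypothetical sketch: soft buttons that live only inside the virtual bezel
# can be repositioned, hidden/shown, or added by the user (cf. claims 9-11).
class BezelButtons:
    def __init__(self, bezel=48, width=1080, height=1920):
        self.bezel, self.width, self.height = bezel, width, height
        self.buttons = {"back": (24, 960, True), "home": (24, 1200, True)}

    def _in_bezel(self, x, y):
        return (x < self.bezel or x >= self.width - self.bezel or
                y < self.bezel or y >= self.height - self.bezel)

    def move(self, name, x, y):
        # Reposition an existing button, but only within the bezel (claim 9).
        if not self._in_bezel(x, y):
            raise ValueError("soft buttons must stay within the virtual bezel")
        _, _, visible = self.buttons[name]
        self.buttons[name] = (x, y, visible)

    def toggle(self, name):
        # Switch a button between visible and hidden modes (claim 10).
        x, y, visible = self.buttons[name]
        self.buttons[name] = (x, y, not visible)

    def add(self, name, x, y):
        # Let the user add a new soft button inside the bezel (claim 11).
        if not self._in_bezel(x, y):
            raise ValueError("soft buttons must stay within the virtual bezel")
        self.buttons[name] = (x, y, True)

panel = BezelButtons()
panel.move("back", 24, 400)
panel.toggle("home")
panel.add("screenshot", 1060, 960)
print(panel.buttons)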

US9645663B2
CLAIM 12
. The display system (베이스) according to claim 9 , wherein the display screen comprises an electronic device status display panel (와이핑) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스 (display system) 를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .

KR20120093148A
CLAIM 6
센서를 통해 곡선형 디스플레이 표면과의 수동적인 상호작용을 감지함으로써 컴퓨터 시스템에 입력을 제공하는 방법으로서 , 상기 상호작용은 a . 잡기(Holding)(한 손 또는 두 손으로 상기 곡선형 디스플레이 표면을 잡는 것은 상기 곡선형 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할을 한다) ;
b . 콜로케이팅(collocating) 또는 쌓기(stacking)(복수의 곡선형 디스플레이를 콜로케이팅하거나 , 콜레이팅하거나 , 또는 쌓는 것은 각각의 디스플레이를 구성하는 하나의 콜로케이팅된 디스플레이 표면을 만들고 , 후속적인 입력은 상기 더 큰 디스플레이 표면상에서 오퍼레이팅한다) ;
c . 터닝(Turning) 또는 회전시키기(Rotating)(상기 곡선형 디스플레이를 하나의 축을 중심으로 회전시키는 것은 상기 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
d . 스월링(Swirling)(상기 곡선형 디스플레이의 몇몇 축과 평행하지만 동심은 아닌 하나의 축을 중심으로 상기 곡선형 디스플레이를 이동시키는 것은 상기 곡선형 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력 수단으로서 역할한다) ;
e . 비평면 스트립 스와이핑 (electronic device status display panel) (Non-planar Strip Swiping)(곡선형 디바이스의 최상부 또는 하단부를 따라 , 또는 상기 디스플레이 바로 위 또는 아래로 하나 이상의 손가락을 이동시키는 것은 상기 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
f . 세 손가락 비평면 핀칭(Three-finger Non-planar Pinching)(곡선형 디스플레이 상에 임계 거리(proximity) 이내에 세 손가락을 놓는 것은 상기 곡선형 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
g . 피닝(Pinning) 및 스와이핑(Swiping)(곡선형 디스플레이 상의 고정된 위치에 하나의 손가락을 놓고 , 후속하여 제2 손가락을 상기 디스플레이 상에 놓고 , 상기 제2 손가락이 그 다음 상기 제1 손가락으로부터 멀어지도록 이동되는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
h . 변형하기(Deforming)(하나의 위치에서 곡선형 디스플레이를 변형시키는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
i . 문지르기(Rubbing)(손 , 손가락 또는 몇몇 도구가 디스플레이 표면 위에서 사인곡선형 패턴으로 이동되는 , 곡선형 디스플레이 상에 문지르는 동작을 제공하는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
j . 기울이기(Tilting)(곡선형 디스플레이를 기울이는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
k . 플리킹(Flicking) 또는 토스하기(Tossing)(곡선형 디스플레이를 빠르게 기울이고 , 정지시키고 , 옵션으로서 그 대략적인 원래의 방향으로 되돌리는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
l . 가만히 두기(Resting)(전자 식품 또는 음료 용기를 하나의 표면 위에 놓고 손을 떼는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
m . 마시기 , 채우기 , 및 유체 수위(전자 식품 또는 음료 용기를 입으로 가져가는 것 ;
상기 용기로부터 음료를 마시는 것 ;
또는 상기 용기를 채우는 것으로 이루어진 그룹으로부터 선택된 동작은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
n . 열기 및 닫기(전자 식품 또는 음료 용기의 뚜껑을 여는 것 및 닫는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
o . 멀티-디바이스 붓기(Pouring)(하나의 전자 식품 또는 음료 용기를 제2의 상기 용기 위에서 잡고 , 후속하여 상기 제1 용기를 기울이는 것은 상기 용기들 중 하나 또는 모두에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
p . 지문 스캐닝(곡선형 디스플레이 표면의 지정된 부분에 사용자의 하나 이상의 손가락을 놓는 것은 연관된 지문이 상기 곡선형 디스플레이 표면상의 정보에 대한 상기 사용자의 접근 권한을 확인할 목적으로 분석되게 한다) ;
및 q . 얼굴 인식(사용자의 얼굴은 상기 용기상의 정보에 대한 상기 사용자의 접근 권한을 확인할 목적으로 전자 식품 또는 음료 용기에 의해 식별된다)으로 이루어진 그룹으로부터 선택된 것을 특징으로 하는 센서를 통해 곡선형 디스플레이 표면과의 수동적인 상호작용을 감지함으로써 컴퓨터 시스템에 입력을 제공하는 방법 .
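
KR20120093148A claim 6, quoted above, lists passive interactions (holding, turning, non-planar strip swiping, fingerprint scanning, and so on) that are sensed and supplied as input to the connected computer system. The sketch below shows, purely hypothetically, how such sensed interactions might be normalized into input events; the dictionary SENSED_INTERACTIONS and the function to_input_event are assumptions for illustration only.

# Hypothetical sketch: each passive interaction sensed by the device (holding,
# tilting, strip swiping, fingerprint scan, ...) is normalized into an input
# event handed to the computer system, as the claim 6 list above enumerates.
SENSED_INTERACTIONS = {
    "holding": "grip detected on curved display surface",
    "tilting": "orientation change beyond threshold",
    "strip_swiping": "finger drag along top or bottom strip",
    "fingerprint_scan": "finger placed on designated scan area",
}

def to_input_event(interaction: str) -> dict:
    # Unknown interactions are passed through unmapped rather than dropped.
    return {
        "interaction": interaction,
        "meaning": SENSED_INTERACTIONS.get(interaction, "unmapped"),
        "delivered_to": "computer system coupled to the display",
    }

for name in ("holding", "strip_swiping", "pouring"):
    print(to_input_event(name))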

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (와이핑) and the pre-defined set of touch-based soft buttons are in a hidden mode .
KR20120093148A
CLAIM 6
센서를 통해 곡선형 디스플레이 표면과의 수동적인 상호작용을 감지함으로써 컴퓨터 시스템에 입력을 제공하는 방법으로서 , 상기 상호작용은 a . 잡기(Holding)(한 손 또는 두 손으로 상기 곡선형 디스플레이 표면을 잡는 것은 상기 곡선형 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할을 한다) ;
b . 콜로케이팅(collocating) 또는 쌓기(stacking)(복수의 곡선형 디스플레이를 콜로케이팅하거나 , 콜레이팅하거나 , 또는 쌓는 것은 각각의 디스플레이를 구성하는 하나의 콜로케이팅된 디스플레이 표면을 만들고 , 후속적인 입력은 상기 더 큰 디스플레이 표면상에서 오퍼레이팅한다) ;
c . 터닝(Turning) 또는 회전시키기(Rotating)(상기 곡선형 디스플레이를 하나의 축을 중심으로 회전시키는 것은 상기 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
d . 스월링(Swirling)(상기 곡선형 디스플레이의 몇몇 축과 평행하지만 동심은 아닌 하나의 축을 중심으로 상기 곡선형 디스플레이를 이동시키는 것은 상기 곡선형 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력 수단으로서 역할한다) ;
e . 비평면 스트립 스와이핑 (electronic device status display panel) (Non-planar Strip Swiping)(곡선형 디바이스의 최상부 또는 하단부를 따라 , 또는 상기 디스플레이 바로 위 또는 아래로 하나 이상의 손가락을 이동시키는 것은 상기 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
f . 세 손가락 비평면 핀칭(Three-finger Non-planar Pinching)(곡선형 디스플레이 상에 임계 거리(proximity) 이내에 세 손가락을 놓는 것은 상기 곡선형 디스플레이에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
g . 피닝(Pinning) 및 스와이핑(Swiping)(곡선형 디스플레이 상의 고정된 위치에 하나의 손가락을 놓고 , 후속하여 제2 손가락을 상기 디스플레이 상에 놓고 , 상기 제2 손가락이 그 다음 상기 제1 손가락으로부터 멀어지도록 이동되는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
h . 변형하기(Deforming)(하나의 위치에서 곡선형 디스플레이를 변형시키는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
i . 문지르기(Rubbing)(손 , 손가락 또는 몇몇 도구가 디스플레이 표면 위에서 사인곡선형 패턴으로 이동되는 , 곡선형 디스플레이 상에 문지르는 동작을 제공하는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
j . 기울이기(Tilting)(곡선형 디스플레이를 기울이는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
k . 플리킹(Flicking) 또는 토스하기(Tossing)(곡선형 디스플레이를 빠르게 기울이고 , 정지시키고 , 옵션으로서 그 대략적인 원래의 방향으로 되돌리는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
l . 가만히 두기(Resting)(전자 식품 또는 음료 용기를 하나의 표면 위에 놓고 손을 떼는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
m . 마시기 , 채우기 , 및 유체 수위(전자 식품 또는 음료 용기를 입으로 가져가는 것 ;
상기 용기로부터 음료를 마시는 것 ;
또는 상기 용기를 채우는 것으로 이루어진 그룹으로부터 선택된 동작은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
n . 열기 및 닫기(전자 식품 또는 음료 용기의 뚜껑을 여는 것 및 닫는 것은 상기 디스플레이와 연관된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
o . 멀티-디바이스 붓기(Pouring)(하나의 전자 식품 또는 음료 용기를 제2의 상기 용기 위에서 잡고 , 후속하여 상기 제1 용기를 기울이는 것은 상기 용기들 중 하나 또는 모두에 연결된 상기 컴퓨터 시스템으로의 입력으로서 역할한다) ;
p . 지문 스캐닝(곡선형 디스플레이 표면의 지정된 부분에 사용자의 하나 이상의 손가락을 놓는 것은 연관된 지문이 상기 곡선형 디스플레이 표면상의 정보에 대한 상기 사용자의 접근 권한을 확인할 목적으로 분석되게 한다) ;
및 q . 얼굴 인식(사용자의 얼굴은 상기 용기상의 정보에 대한 상기 사용자의 접근 권한을 확인할 목적으로 전자 식품 또는 음료 용기에 의해 식별된다)으로 이루어진 그룹으로부터 선택된 것을 특징으로 하는 센서를 통해 곡선형 디스플레이 표면과의 수동적인 상호작용을 감지함으로써 컴퓨터 시스템에 입력을 제공하는 방법 .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (다이오드) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
KR20120093148A
CLAIM 1
재사용가능한 휴대용 상호작용 장치로서 , a . 맞춤형(customizable) 뚜껑 ;
b . 입력 및 출력 디바이스 선택부 ;
c . 플렉시블 E-잉크 , 플렉시블 유기발광 다이오드 (s thermal sensors) , 플렉시블 LED 어레이 , 프로젝션 , 레이저 , 및 페인터블(Paintable) 디스플레이로 이루어진 그룹으로부터 선택된 곡선형 디스플레이 기술을 가진 용기 부 ;
d . 배터리 , 파워 커넥터 , 네트워크 커넥터 , 시청각 커넥터 , 중앙처리장치 , 무선 네트워크 송수신기 , 그래픽 회로 보드 , RAM 메모리 , 펌웨어 ROM , 플래시 및 하드 디스크 드라이브로 이루어진 그룹으로부터 선택된 컴퓨팅 장치를 구비한 베이스를 포함하는 것을 특징으로 하는 재사용가능한 휴대용 상호작용 장치 .
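
Claim 18 of US9645663B2, charted above, derives a polygonal area from a heat signature sensed by thermal sensors, tracks how frequently that area is accessed, and registers the result as a personalized holding pattern defining the virtual bezel. The sketch below is a minimal, hypothetical rendering of that flow; HoldingPatternRegistry, its access threshold, and the sample grip polygon are invented for this chart.

# Hypothetical sketch of the claim 18 flow: thermal-sensor contact points form
# a polygon, the polygon and its access frequency are stored, and a frequently
# seen polygon becomes the personalized holding pattern / virtual bezel.
from collections import defaultdict

class HoldingPatternRegistry:
    def __init__(self, threshold=3):
        self.threshold = threshold                  # accesses before "personalized"
        self.frequency = defaultdict(int)           # polygon -> access count

    @staticmethod
    def polygon_from_heat(points):
        # Reduce raw heat-signature points to a hashable vertex tuple.
        return tuple(sorted(points))

    def register_access(self, points):
        poly = self.polygon_from_heat(points)
        self.frequency[poly] += 1
        return poly

    def personalized_bezel(self):
        # The most frequently accessed polygon, if seen often enough.
        poly, count = max(self.frequency.items(), key=lambda kv: kv[1])
        return poly if count >= self.threshold else None

registry = HoldingPatternRegistry()
grip = [(0, 300), (40, 300), (40, 700), (0, 700)]   # left-edge grip polygon
for _ in range(4):
    registry.register_access(grip)
print(registry.personalized_bezel())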




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20110261058A1

Filed: 2010-05-04     Issued: 2011-10-27

Method for user input from the back panel of a handheld computerized device

(Original Assignee) Tong Luo     (Current Assignee) HANDSCAPE Inc A DELAWARE Corp

Tong Luo
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices (imaginary plane) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20110261058A1
CLAIM 16
. The method of claim 11 , wherein said graphical representation of at least said user's fingers on said at least one graphics display screen is done by the steps of : using said assignment of said data on the location and movement of said user's fingers and/or hand to specific fingers on said biomechanical and anatomical model of said human hand to create a three dimensional model of the user's hand and fingers in memory ; creating a two-dimensional projection of said three dimensional model of the user's hand and fingers in memory , wherein said two-dimensional projection is on an imaginary plane (area comprising vertices) that corresponds in both distance and orientation to said touchpad ; and using said two-dimensional projection on said imaginary plane to generate said graphical representation of at least said user's fingers on said at least one graphics display screen .
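
US20110261058A1 claim 16, quoted above, builds a three-dimensional model of the user's hand and fingers and projects it onto an imaginary plane aligned with the touchpad. The sketch below illustrates only the projection step, assuming an orthographic projection onto the z = 0 plane; the fingertip coordinates and function names are invented for illustration.

# Hypothetical sketch: project a 3-D fingertip model onto an imaginary plane
# parallel to the touchpad (here the z = 0 plane) to obtain the 2-D rendering
# positions described in the quoted claim; the model data are made up.
fingertips_3d = {
    "index": (0.20, 0.55, 0.03),   # metres in a touchpad-aligned frame
    "middle": (0.24, 0.60, 0.04),
    "thumb": (0.10, 0.40, 0.02),
}

def project_to_plane(point3d):
    # Orthographic projection: drop the height above the touchpad plane.
    x, y, _z = point3d
    return (x, y)

projection_2d = {name: project_to_plane(p) for name, p in fingertips_3d.items()}
print(projection_2d)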

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (s hand) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices (imaginary plane) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20110261058A1
CLAIM 9
. The method of claim 1 , in which said at least one data entry location is highlighted on said at least one graphics display screen whenever said computerized device determines that at least one finger on said user's hand (s hand) has left the touchpad and the position and motion history of said at least one finger is consistent with a capability said at least one finger on said user's hand to strike a position on said touchpad that is consistent with the location of said at least one data entry location on said at least one graphics display screen .

US20110261058A1
CLAIM 16
. The method of claim 11 , wherein said graphical representation of at least said user's fingers on said at least one graphics display screen is done by the steps of : using said assignment of said data on the location and movement of said user's fingers and/or hand to specific fingers on said biomechanical and anatomical model of said human hand to create a three dimensional model of the user's hand and fingers in memory ; creating a two-dimensional projection of said three dimensional model of the user's hand and fingers in memory , wherein said two-dimensional projection is on an imaginary plane (area comprising vertices) that corresponds in both distance and orientation to said touchpad ; and using said two-dimensional projection on said imaginary plane to generate said graphical representation of at least said user's fingers on said at least one graphics display screen .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20100141605A1

Filed: 2009-12-07     Issued: 2010-06-10

Flexible display device and data displaying method thereof

(Original Assignee) Samsung Electronics Co Ltd     (Current Assignee) Samsung Electronics Co Ltd

Tae Young Kang, Kyoung Woon Hahm, Hyun Jin Kim, JuYun Sung
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (second corner, first corner) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20100141605A1
CLAIM 3
. The device of claim 2 , wherein the controller displays a folded image if a first bend event is detected in response to a first corner (first portion) of the display unit being bent , and adds a bookmark property to a property of the content displayed with the folded image .

US20100141605A1
CLAIM 5
. The device of claim 2 , wherein the display unit displays a menu image if a second bend event is detected as a second corner (first portion) of the display unit is bent .

US20100141605A1
CLAIM 21
. The device of claim 20 , wherein the display unit enlarges a size of a first content and displays the first content in an area in which a first bend event is detected in a first direction (first mode) , and reduces a size of second content and displays the second content in an area in which a second bend event is detected in a second direction .
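
The US20100141605A1 claims quoted above tie bend events to actions: bending a first corner folds the page and bookmarks the content, bending a second corner opens a menu, and the bend direction enlarges or reduces content. A hypothetical sketch of that event handling follows; handle_bend and the event dictionaries are assumptions for this chart.

# Hypothetical sketch of the bend-event behaviour quoted above: bending a
# first corner folds the page and bookmarks it, bending a second corner opens
# a menu, and bend direction scales the content up or down.
def handle_bend(event):
    corner, direction = event.get("corner"), event.get("direction")
    if corner == "first":
        return ["display folded image", "add bookmark property to content"]
    if corner == "second":
        return ["display menu image"]
    if direction == "first":
        return ["enlarge first content in bent area"]
    if direction == "second":
        return ["reduce second content in bent area"]
    return ["ignore"]

for ev in ({"corner": "first"}, {"corner": "second"}, {"direction": "second"}):
    print(ev, "->", handle_bend(ev))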

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US20100141605A1
CLAIM 21
. The device of claim 20 , wherein the display unit enlarges a size of a first content and displays the first content in an area in which a first bend event is detected in a first direction (first mode) , and reduces a size of second content and displays the second content in an area in which a second bend event is detected in a second direction .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (second corner, first corner) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20100141605A1
CLAIM 3
. The device of claim 2 , wherein the controller displays a folded image if a first bend event is detected in response to a first corner (first portion) of the display unit being bent , and adds a bookmark property to a property of the content displayed with the folded image .

US20100141605A1
CLAIM 5
. The device of claim 2 , wherein the display unit displays a menu image if a second bend event is detected as a second corner (first portion) of the display unit is bent .

US20100141605A1
CLAIM 21
. The device of claim 20 , wherein the display unit enlarges a size of a first content and displays the first content in an area in which a first bend event is detected in a first direction (first mode) , and reduces a size of second content and displays the second content in an area in which a second bend event is detected in a second direction .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20100066677A1

Filed: 2009-09-15     Issued: 2010-03-18

Computer Peripheral Device Used for Communication and as a Pointing Device

(Original Assignee) Legacy IP LLC     (Current Assignee) Edge Mobile Payments LLC

Peter Garrett, Paul Regen
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20100066677A1
CLAIM 14
. The computer peripheral device of claim 4 wherein word processing command shortcuts can be placed on the touch screen and may be operated to perform tasks within a word processing application running on a connected computing system (touchscreen layer, touchscreen display) .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (host computer) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (host computer) using predefined set of gestures to toggle a full-screen mode .
US20100066677A1
CLAIM 2
. The computer peripheral device of claim 1 wherein the computing input device software enables a wireless mouse feature for operating a host computer (operating system status bar, status bar visibility) .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20100066677A1
CLAIM 14
. The computer peripheral device of claim 4 wherein word processing command shortcuts can be placed on the touch screen and may be operated to perform tasks within a word processing application running on a connected computing system (touchscreen layer, touchscreen display) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20100066677A1
CLAIM 14
. The computer peripheral device of claim 4 wherein word processing command shortcuts can be placed on the touch screen and may be operated to perform tasks within a word processing application running on a connected computing system (touchscreen layer, touchscreen display) .
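
Claim 16 of US9645663B2, charted above, registers the detected holding-contact region as the virtual bezel, offers the user a choice of response type, and stores that instruction as personalized behavior for the region. The sketch below is a minimal, hypothetical illustration; PersonalizedBezel, its methods, and the response-type names are invented here.

# Hypothetical sketch of the claim 16 method quoted above: the contact region
# of the holding hand is registered as the virtual bezel, and the user's
# chosen response type is stored as personalized behaviour for that region.
class PersonalizedBezel:
    def __init__(self):
        self.bezel_region = None
        self.response_instruction = None

    def register_region(self, contact_points):
        # Store the detected holding-contact region as the virtual bezel.
        self.bezel_region = tuple(contact_points)

    def offer_instruction(self, available, chosen):
        # Offer the user a set of response types and remember the choice.
        if chosen not in available:
            raise ValueError("unsupported response type")
        self.response_instruction = chosen

    def handle_bezel_touch(self):
        return self.response_instruction or "default: ignore"

device = PersonalizedBezel()
device.register_region([(0, 300), (40, 300), (40, 700), (0, 700)])
device.offer_instruction({"ignore", "scroll", "page_turn"}, "page_turn")
print(device.handle_bezel_touch())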

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (digital camera) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20100066677A1
CLAIM 8
. The computer peripheral device of claim 1 further comprising a digital camera (touchscreen area) device .

US20100066677A1
CLAIM 14
. The computer peripheral device of claim 4 wherein word processing command shortcuts can be placed on the touch screen and may be operated to perform tasks within a word processing application running on a connected computing system (touchscreen layer, touchscreen display) .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20100066677A1
CLAIM 14
. The computer peripheral device of claim 4 wherein word processing command shortcuts can be placed on the touch screen and may be operated to perform tasks within a word processing application running on a connected computing system (touchscreen layer, touchscreen display) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN101729658A

Filed: 2009-09-04     Issued: 2010-06-09

使用投影仪模块的节电移动终端及其方法

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

金钟焕
US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction (响应于检测) in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
CN101729658A
CLAIM 8
. 一种在具有终端机身并配有投影仪模块的移动终端中节电的方法,所述方法包括: 向所述投影仪模块供电;从所述投影仪模块将图像投影到外界表面上;由位于所述终端机身中的运动传感器检测所述终端机身的运动;以及 响应于检测 (response instruction, s response instruction) 到运动而中断对所述投影仪模块的供电。
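
CN101729658A claim 8, quoted above, interrupts the supply of power to the projector module when the motion sensor detects movement of the terminal body. The following hypothetical sketch illustrates that rule only; ProjectorTerminal and on_motion_sensor are invented names, not terms from the reference.

# Hypothetical sketch of the power-saving rule in the quoted CN101729658A
# claim 8: while projecting, detected motion of the terminal body interrupts
# the supply of power to the projector module.
class ProjectorTerminal:
    def __init__(self):
        self.projector_powered = True

    def on_motion_sensor(self, moving: bool) -> bool:
        if moving and self.projector_powered:
            self.projector_powered = False   # interrupt power on motion
        return self.projector_powered

terminal = ProjectorTerminal()
print(terminal.on_motion_sensor(False))  # True: keeps projecting
print(terminal.on_motion_sensor(True))   # False: power interrupted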




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2178274A1

Filed: 2009-07-30     Issued: 2010-04-21

Power saving mobile terminal using projector module and method for same

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

Jong Hwan Kim
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input (user input) of a pause command or manipulation of a key input , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one (operating system status bar) of a user input of a pause command or manipulation of a key input , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (key input) displaying at least one information item from a set of information items (key input) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input of a pause command or manipulation of a key input (electronic device status display panel, information items) , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (key input) and the pre-defined set of touch-based soft buttons are in a hidden mode .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input of a pause command or manipulation of a key input (electronic device status display panel, information items) , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input (user input) of a pause command or manipulation of a key input , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input (user input) of a pause command or manipulation of a key input , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input (user input) of a pause command or manipulation of a key input , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2178274A1
CLAIM 3
The mobile terminal of claim 1 , wherein the specified event comprises one of a user input (user input) of a pause command or manipulation of a key input , a receipt of one of a voice call or a text message , an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status , a lapse of a predetermined time , and the user setting comprises an indication of the predetermined time .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20100099457A1

Filed: 2009-07-24     Issued: 2010-04-22

Mobile communication terminal and power saving method thereof

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

Jong Hwan Kim
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20100099457A1
CLAIM 3
. The mobile terminal of claim 2 , wherein the user event comprises a user input (user input) of a pause command .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20100099457A1
CLAIM 23
. The mobile terminal of claim 15 , wherein the specified event comprises one (operating system status bar) of an SMS/MMS , alarm/schedule management , a battery status or a radio signal reception status .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (key input) displaying at least one information item from a set of information items (key input) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20100099457A1
CLAIM 4
. The mobile terminal of claim 2 , wherein the user event comprises manipulation of a key input (electronic device status display panel, information items) .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (key input) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20100099457A1
CLAIM 4
. The mobile terminal of claim 2 , wherein the user event comprises manipulation of a key input (electronic device status display panel, information items) .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20100099457A1
CLAIM 3
. The mobile terminal of claim 2 , wherein the user event comprises a user input (user input) of a pause command .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20100099457A1
CLAIM 3
. The mobile terminal of claim 2 , wherein the user event comprises a user input (user input) of a pause command .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20100099457A1
CLAIM 3
. The mobile terminal of claim 2 , wherein the user event comprises a user input (user input) of a pause command .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20100099457A1
CLAIM 3
. The mobile terminal of claim 2 , wherein the user event comprises a user input (user input) of a pause command .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20100045705A1

Filed: 2009-07-10     Issued: 2010-02-25

Interaction techniques for flexible displays

(Original Assignee) Roel Vertegaal; Justin Lee; Behar Yves; Pichaya Puttorngul     

Roel Vertegaal, Justin Lee, Yves Béhar, Pichaya Puttorngul
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps, said input, said server) of response to a second set (following steps, said input, said server) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps, said input, said server) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20100045705A1
CLAIM 3
. The apparatus of claim 2 wherein said input (second mode, second set, second portion) and output devices are disposed on said customizable lid .

US20100045705A1
CLAIM 8
. The method of claim 6 wherein said input to said computer system causes a command to execute on said computer system and wherein said command is selected from a group consisting of : a . Activate , wherein the software and display of said computer system awakes from sleep , disabling a screen saver or energy reduce state , or enabling advertisement activity , and b . Deactivate , wherein in the software and display of said computer goes to sleep , enabling a screen saver or energy reduced state , or disabling advertisement activity , and c . Zoom in or Enlarge , wherein an image or content of a file or document rendered on said display is enlarged or zoomed in on , and d . Zoom out or Reduce , wherein an image or content of a file or document rendered on said display is reduced or zoomed out of , and e . Organize , wherein some property of file(s) , digital information , text , images , or other computer content associated with or displaying on said display surface(s) is organized or sorted digitally in a way that matches properties of the physical computer system , such as physical order , and f . Scroll , wherein a segment of an image or content of a file , document or application is rendered on a display , said segment being not previously rendered , and said segment being spatially contiguous to the segment of said image or content that was previously rendered on said display , and g . Page Down , wherein a segment of the content of a file subsequent to the section of said content of a file that is currently rendered on a display , is navigated to such that it causes said subsequent section to be rendered on said display , and h . Page Up , wherein a segment of the content of a file that precedes the section of said content of a file that is currently rendered on a display , is navigated to such that it causes said preceding section to be rendered on said display , and i . Navigate , wherein an arbitrary section of the content of a file on said computer system , or some online content , hyperlink , or menu is navigated to such that it causes said the associated content to render on a display , and j . Page Back or Forward , wherein a section of the content of a file , or some online content , webpage or hyperlink that precedes or follows the section of said content currently rendered on a display , is navigated to such that it causes said content to be rendered on said display , and Open , Save or Close , wherein some file or digital information on said computer system is opened or closed , read into memory , or out to a permanent storage medium , and k . Move , Copy or Paste , wherein a section of the content of a file , image , text or some other digital information associated with said computer system or display is transferred to another computer system or display , or some different logical location on said same computer system or display , and Select , where graphical objects rendered on a display is selected such that it becomes the recipient of a subsequent action , input or command to the associated computer system , and l . Click , wherein an insertion point or cursor is moved to a specific location on a display , selecting or activating graphical objects underlying said location on said display , and m . Erase , wherein selected information or images , or content associated with said images on a computer system , is erased from said display and/or from the memory of said computer system , and n . 
Playback control , wherein a multimedia file , including graphics animation , video , sound or musical content on said computer system , is played at some speed , and wherein said speed is optionally controlled by said input , and o . Connect , wherein said computer system is connected through a computer network to another computer system , online server , communication tool or social networking site , and p . Share , wherein information on said computer system is placed on a computer server for the purpose of sharing said information with other users connected to said server (second mode, second set, second portion) , and q . Online status , wherein information about the usage of said computer system by the user , or some arbitrary status or attribute of said user , is shared with a computer server for the purpose of sharing said information with other users connected to said server , and r . Communicate , wherein said computer system serves as a communication device , and s . Advertise , wherein an advertisement is rendered on a display , and t . Order , wherein a beverage or food order selected on a display is processed and communicated to a vendor , vending machine , refilling station , or dispenser along with payment for said order , and u . Gamble and Game , wherein said computer system is used to play games , promotional games of chance , lotteries or the like , and v . Segmented Display , wherein said computer system displays an image across a multitude of displays , and w . Authenticate , wherein said computer system provides access to a particular user or usage of information on said computer system .

US20100045705A1
CLAIM 14
. A method for delivering promotional materials from a vendor or vending machine to a customer's interactive food or beverage container comprising the following steps (second mode, second set, second portion) : a . Optionally , identifying said container by said vendor or vending machine through said container being within threshold distance of said vendor or vending machine , and b . Optionally , identifying said container by said customer contacting said vendor or vending machine through a user interface disposed on said container , and c . Optionally , identifying said container by said customer placing an order with said vendor or vending machine , and wherein d . Said vendor or vending machine selecting said promotional materials on the basis of chance , characteristics of said customer's history of orders ;
or characteristics of said customer's order ;
and e . Digitally uploading said promotional materials to said container by a wireless or wired network , and f . Displaying or playing on said container of said promotional materials .

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one (RFID tags) or more touch-based soft buttons within the virtual bezel region .
US20100045705A1
CLAIM 2
. The apparatus of claim 1 wherein input and output devices selected from the group consisting of one or more 6 DOF accelerometer(s) , Gyroscope , Bend Sensor , Touch screen , Capacitive touch sensor , Heart rate sensor , Galvanic skin conductor sensor , Alpha Dial potentiometer , Video camera , Still camera , Hygrometer ;
Liquid Level Sensor ;
Potentiometric Liquid Chemical Sensor , Altimeter , Thermometer , Force sensor ;
Pressure Sensor ;
Microphone , GPS , Buttons , Photoelectric Sensor ;
Proximity Sensor , Electronic payment system , One or more RFID tags (add one) , Fingerprint reader , A water purification system , Ultraviolet light purification system , Carbon filtration system , Chemical or organic content analyzer , Bacterial content analyzer , Amplification system Speaker system and Compass .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items (placing one) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20100045705A1
CLAIM 6
. A method for providing input to a computer system by sensing manual interactions with a curved display surface through a sensor , wherein said interactions are selected from a group consisting of : a . Holding , wherein holding the curved display surface with one or two hands serves as input to the computer system associated with said curved display ;
b . Collocating or stacking , wherein collocating , collating or stacking multiple curved displays creates a single contiguous display surface consisting of individual displays , and wherein subsequent inputs operate on said larger display surface ;
c . Turning or Rotating , wherein rotating said curved display around an axis serves as input to the computer system associated with said display ;
d . Swirling , wherein moving said curved display in around an axis that is non-concentrical but parallel to some axis of said curved display serves as a means of input to the computer system associated with said curved display ;
e . Non-planar Strip Swiping , wherein moving one or more fingers along the top or bottom extremities of a curved display , or just above or below said display , serves as input to the computer system associated with said display ;
f . Three-finger Non-planar Pinching , wherein placing three fingers within a threshold proximity on a curved display serves as input to the computer system associated with said curved display ;
g . Pining and Swiping , wherein placing one (information items) finger on a fixed location on a curved display , while subsequently placing a second finger on said display , and wherein said second finger is subsequently moved away from said first finger , serves as input to the computer system associated with said display ;
h . Deforming , wherein deforming a curved display at one location serves as input to the computer system associated with said display ;
i . Rubing , wherein providing a rubbing action on a curved display , in which the hand , finger , or some tool is moved in a sinusoidal pattern over its surface , serves as input to the computer system associated with said display ;
j . Tilting , wherein tilting a curved display serves as input to the computer system associated with said display ;
k . Flicking or Tossing , wherein rapidly tilting a curved display , stopping and optionally returning to its approximate original orientation serves as input to the computer system associated with said display ;
l . Resting , wherein placing and releasing an electronic food or beverage container on a surface , serves as input to the computer system associated with said container ;
m . Drinking , Filling and Fluid Level , wherein an action selected from a group consisting of : bringing an electronic food or beverage container to the mouth ;
drinking a beverage from said container ;
or filling said container serves as input to the computer system associated with said container ;
n . Opening and closing , wherein opening and closing the lid of an electronic food or beverage container serves as input to the computer system associated with said container ;
o . Multi-device Pouring , wherein holding an electronic food or beverage container over a second said container , and subsequently tilting said first container , serves as input to the computer system associated with either or both containers ;
p . Fingerprint scanning , wherein placing one or more fingers of a user on a designated part of a curved display surface causes associated fingerprints to be analyzed with the purpose of authenticating access by said user to information on said curved display surface ;
q . Face detection , wherein the face of a user is identified by an electronic food or beverage container for the purpose of authenticating access of said the user to information on said container ;


US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (following steps, said input, said server) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (following steps, said input, said server) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
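
Note: the following is a minimal Python sketch, illustrative only, of the two response modes recited in claim 14: touches are hit-tested against an assumed rectangular active region, and touches landing in the surrounding virtual bezel strip are passed through only when they appear intentional. The region geometry, class names and handler strings are assumptions, not the claimed implementation.

    # Illustrative sketch only; geometry, class names and handler strings are assumed.
    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: float
        y: float
        w: float
        h: float

        def contains(self, px: float, py: float) -> bool:
            return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    class VirtualBezelScreen:
        def __init__(self, width: float, height: float, bezel: float):
            # The virtual bezel is a strip along each edge; the remainder of the
            # touchscreen is the active region.
            self.active = Rect(bezel, bezel, width - 2 * bezel, height - 2 * bezel)

        def handle_touch(self, x: float, y: float, is_swipe: bool) -> str:
            if self.active.contains(x, y):
                # First mode: any touch in the active region is ordinary input.
                return "active: dispatch to foreground content"
            # Second mode: only touches that look intentional (here, a swipe)
            # may affect the content shown in the active region.
            if is_swipe:
                return "bezel: intentional gesture -> affects active-region content"
            return "bezel: resting grip -> ignored"

    screen = VirtualBezelScreen(width=1080, height=1920, bezel=60)
    print(screen.handle_touch(540, 960, is_swipe=False))  # active region
    print(screen.handle_touch(10, 960, is_swipe=False))   # bezel, ignored
    print(screen.handle_touch(10, 960, is_swipe=True))    # bezel, intentional
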
US20100045705A1
CLAIM 3
. The apparatus of claim 2 wherein said input (second mode, second set, second portion) and output devices are disposed on said customizable lid .

US20100045705A1
CLAIM 8
. The method of claim 6 wherein said input to said computer system causes a command to execute on said computer system and wherein said command is selected from a group consisting of : a . Activate , wherein the software and display of said computer system awakes from sleep , disabling a screen saver or energy reduce state , or enabling advertisement activity , and b . Deactivate , wherein in the software and display of said computer goes to sleep , enabling a screen saver or energy reduced state , or disabling advertisement activity , and c . Zoom in or Enlarge , wherein an image or content of a file or document rendered on said display is enlarged or zoomed in on , and d . Zoom out or Reduce , wherein an image or content of a file or document rendered on said display is reduced or zoomed out of , and e . Organize , wherein some property of file(s) , digital information , text , images , or other computer content associated with or displaying on said display surface(s) is organized or sorted digitally in a way that matches properties of the physical computer system , such as physical order , and f . Scroll , wherein a segment of an image or content of a file , document or application is rendered on a display , said segment being not previously rendered , and said segment being spatially contiguous to the segment of said image or content that was previously rendered on said display , and g . Page Down , wherein a segment of the content of a file subsequent to the section of said content of a file that is currently rendered on a display , is navigated to such that it causes said subsequent section to be rendered on said display , and h . Page Up , wherein a segment of the content of a file that precedes the section of said content of a file that is currently rendered on a display , is navigated to such that it causes said preceding section to be rendered on said display , and i . Navigate , wherein an arbitrary section of the content of a file on said computer system , or some online content , hyperlink , or menu is navigated to such that it causes said the associated content to render on a display , and j . Page Back or Forward , wherein a section of the content of a file , or some online content , webpage or hyperlink that precedes or follows the section of said content currently rendered on a display , is navigated to such that it causes said content to be rendered on said display , and Open , Save or Close , wherein some file or digital information on said computer system is opened or closed , read into memory , or out to a permanent storage medium , and k . Move , Copy or Paste , wherein a section of the content of a file , image , text or some other digital information associated with said computer system or display is transferred to another computer system or display , or some different logical location on said same computer system or display , and Select , where graphical objects rendered on a display is selected such that it becomes the recipient of a subsequent action , input or command to the associated computer system , and l . Click , wherein an insertion point or cursor is moved to a specific location on a display , selecting or activating graphical objects underlying said location on said display , and m . Erase , wherein selected information or images , or content associated with said images on a computer system , is erased from said display and/or from the memory of said computer system , and n . 
Playback control , wherein a multimedia file , including graphics animation , video , sound or musical content on said computer system , is played at some speed , and wherein said speed is optionally controlled by said input , and o . Connect , wherein said computer system is connected through a computer network to another computer system , online server , communication tool or social networking site , and p . Share , wherein information on said computer system is placed on a computer server for the purpose of sharing said information with other users connected to said server (second mode, second set, second portion) , and q . Online status , wherein information about the usage of said computer system by the user , or some arbitrary status or attribute of said user , is shared with a computer server for the purpose of sharing said information with other users connected to said server , and r . Communicate , wherein said computer system serves as a communication device , and s . Advertise , wherein an advertisement is rendered on a display , and t . Order , wherein a beverage or food order selected on a display is processed and communicated to a vendor , vending machine , refilling station , or dispenser along with payment for said order , and u . Gamble and Game , wherein said computer system is used to play games , promotional games of chance , lotteries or the like , and v . Segmented Display , wherein said computer system displays an image across a multitude of displays , and w . Authenticate , wherein said computer system provides access to a particular user or usage of information on said computer system .

US20100045705A1
CLAIM 14
. A method for delivering promotional materials from a vendor or vending machine to a customer's interactive food or beverage container comprising the following steps (second mode, second set, second portion) : a . Optionally , identifying said container by said vendor or vending machine through said container being within threshold distance of said vendor or vending machine , and b . Optionally , identifying said container by said customer contacting said vendor or vending machine through a user interface disposed on said container , and c . Optionally , identifying said container by said customer placing an order with said vendor or vending machine , and wherein d . Said vendor or vending machine selecting said promotional materials on the basis of chance , characteristics of said customer's history of orders ;
or characteristics of said customer's order ;
and e . Digitally uploading said promotional materials to said container by a wireless or wired network , and f . Displaying or playing on said container of said promotional materials .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (following steps, said input, said server) of response in the virtual bezel region .
US20100045705A1
CLAIM 3
. The apparatus of claim 2 wherein said input (second mode, second set, second portion) and output devices are disposed on said customizable lid .

US20100045705A1
CLAIM 8
. The method of claim 6 wherein said input to said computer system causes a command to execute on said computer system and wherein said command is selected from a group consisting of : a . Activate , wherein the software and display of said computer system awakes from sleep , disabling a screen saver or energy reduce state , or enabling advertisement activity , and b . Deactivate , wherein in the software and display of said computer goes to sleep , enabling a screen saver or energy reduced state , or disabling advertisement activity , and c . Zoom in or Enlarge , wherein an image or content of a file or document rendered on said display is enlarged or zoomed in on , and d . Zoom out or Reduce , wherein an image or content of a file or document rendered on said display is reduced or zoomed out of , and e . Organize , wherein some property of file(s) , digital information , text , images , or other computer content associated with or displaying on said display surface(s) is organized or sorted digitally in a way that matches properties of the physical computer system , such as physical order , and f . Scroll , wherein a segment of an image or content of a file , document or application is rendered on a display , said segment being not previously rendered , and said segment being spatially contiguous to the segment of said image or content that was previously rendered on said display , and g . Page Down , wherein a segment of the content of a file subsequent to the section of said content of a file that is currently rendered on a display , is navigated to such that it causes said subsequent section to be rendered on said display , and h . Page Up , wherein a segment of the content of a file that precedes the section of said content of a file that is currently rendered on a display , is navigated to such that it causes said preceding section to be rendered on said display , and i . Navigate , wherein an arbitrary section of the content of a file on said computer system , or some online content , hyperlink , or menu is navigated to such that it causes said the associated content to render on a display , and j . Page Back or Forward , wherein a section of the content of a file , or some online content , webpage or hyperlink that precedes or follows the section of said content currently rendered on a display , is navigated to such that it causes said content to be rendered on said display , and Open , Save or Close , wherein some file or digital information on said computer system is opened or closed , read into memory , or out to a permanent storage medium , and k . Move , Copy or Paste , wherein a section of the content of a file , image , text or some other digital information associated with said computer system or display is transferred to another computer system or display , or some different logical location on said same computer system or display , and Select , where graphical objects rendered on a display is selected such that it becomes the recipient of a subsequent action , input or command to the associated computer system , and l . Click , wherein an insertion point or cursor is moved to a specific location on a display , selecting or activating graphical objects underlying said location on said display , and m . Erase , wherein selected information or images , or content associated with said images on a computer system , is erased from said display and/or from the memory of said computer system , and n . 
Playback control , wherein a multimedia file , including graphics animation , video , sound or musical content on said computer system , is played at some speed , and wherein said speed is optionally controlled by said input , and o . Connect , wherein said computer system is connected through a computer network to another computer system , online server , communication tool or social networking site , and p . Share , wherein information on said computer system is placed on a computer server for the purpose of sharing said information with other users connected to said server (second mode, second set, second portion) , and q . Online status , wherein information about the usage of said computer system by the user , or some arbitrary status or attribute of said user , is shared with a computer server for the purpose of sharing said information with other users connected to said server , and r . Communicate , wherein said computer system serves as a communication device , and s . Advertise , wherein an advertisement is rendered on a display , and t . Order , wherein a beverage or food order selected on a display is processed and communicated to a vendor , vending machine , refilling station , or dispenser along with payment for said order , and u . Gamble and Game , wherein said computer system is used to play games , promotional games of chance , lotteries or the like , and v . Segmented Display , wherein said computer system displays an image across a multitude of displays , and w . Authenticate , wherein said computer system provides access to a particular user or usage of information on said computer system .

US20100045705A1
CLAIM 14
. A method for delivering promotional materials from a vendor or vending machine to a customer's interactive food or beverage container comprising the following steps (second mode, second set, second portion) : a . Optionally , identifying said container by said vendor or vending machine through said container being within threshold distance of said vendor or vending machine , and b . Optionally , identifying said container by said customer contacting said vendor or vending machine through a user interface disposed on said container , and c . Optionally , identifying said container by said customer placing an order with said vendor or vending machine , and wherein d . Said vendor or vending machine selecting said promotional materials on the basis of chance , characteristics of said customer's history of orders ;
or characteristics of said customer's order ;
and e . Digitally uploading said promotional materials to said container by a wireless or wired network , and f . Displaying or playing on said container of said promotional materials .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20100007632A1

Filed: 2009-07-01     Issued: 2010-01-14

Semiconductor device

(Original Assignee) Semiconductor Energy Laboratory Co Ltd     (Current Assignee) Semiconductor Energy Laboratory Co Ltd

Shunpei Yamazaki
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (second pixel) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
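
Note: the following is a minimal Python sketch, illustrative only, of the "gestural software application" element of claim 1: a software layer that produces the second mode of response by classifying bezel-confined touches and forwarding only those judged intentional to affect active-region content. The thresholds, gesture vocabulary and command names are assumptions.

    # Illustrative sketch; thresholds and command names are assumed.
    import math

    SWIPE_MIN_DISTANCE = 80.0   # assumed threshold, pixels
    TAP_MAX_DURATION = 0.25     # assumed threshold, seconds

    def classify_bezel_touch(path, duration):
        """Decide whether a touch confined to the virtual bezel is intentional.

        path: list of (x, y) samples for one touch; duration: seconds held.
        Returns a command affecting the active region, or None to ignore.
        """
        (x0, y0), (x1, y1) = path[0], path[-1]
        travel = math.hypot(x1 - x0, y1 - y0)
        if travel >= SWIPE_MIN_DISTANCE:
            return "scroll_active_content"   # deliberate swipe along the bezel
        if duration <= TAP_MAX_DURATION:
            return "activate_soft_button"    # quick deliberate tap
        return None                          # long, static contact: resting grip

    # A resting thumb barely moves and stays down for seconds -> ignored.
    print(classify_bezel_touch([(5, 400), (6, 402)], duration=2.0))
    # A quick flick along the edge -> interpreted as intentional input.
    print(classify_bezel_touch([(5, 400), (6, 300)], duration=0.2))
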
US20100007632A1
CLAIM 1
. A semiconductor device comprising : a display portion including a first pixel electrode , a second pixel (first portion) electrode , a plurality of photo sensors between the first pixel electrode and the second pixel electrode , and a plurality of color filters ;
and a driver circuit portion configured to drive the display portion , wherein the driver circuit portion includes a transistor including a single crystal semiconductor layer .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (second pixel) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20100007632A1
CLAIM 1
. A semiconductor device comprising : a display portion including a first pixel electrode , a second pixel (first portion) electrode , a plurality of photo sensors between the first pixel electrode and the second pixel electrode , and a plurality of color filters ;
and a driver circuit portion configured to drive the display portion , wherein the driver circuit portion includes a transistor including a single crystal semiconductor layer .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (red color) to define a personalized holding pattern (liquid crystal element) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
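
Note: the following is a minimal Python sketch, illustrative only, of the claim 17 method: an unintentional grip touch is reduced to a polygonal area, the area and its access frequency are registered, and frequently touched areas become the personalized holding pattern that defines the virtual bezel region. The bounding-box polygon and the frequency threshold are simplifying assumptions.

    # Minimal sketch; an axis-aligned bounding polygon and an in-memory counter
    # stand in for the claimed "registering ... in a memory".
    from collections import Counter

    def bounding_polygon(points):
        """Reduce touch samples to four vertices of a bounding polygon."""
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        return ((min(xs), min(ys)), (max(xs), min(ys)),
                (max(xs), max(ys)), (min(xs), max(ys)))

    class HoldingPatternRegistry:
        def __init__(self, min_hits=5):       # assumed frequency threshold
            self.hits = Counter()
            self.min_hits = min_hits

        def record(self, polygon):
            self.hits[polygon] += 1           # register area, count accesses
            return self.hits[polygon]

        def holding_pattern(self):
            # Areas touched often enough become the personalized holding
            # pattern, i.e. the user-specific virtual bezel region.
            return [p for p, n in self.hits.items() if n >= self.min_hits]

    reg = HoldingPatternRegistry(min_hits=2)
    grip = bounding_polygon([(2, 300), (8, 350), (5, 420)])
    reg.record(grip)
    reg.record(grip)
    print(reg.holding_pattern())
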
US20100007632A1
CLAIM 3
. The semiconductor device according to claim 1 , wherein the display portion includes a liquid crystal element (holding pattern) .

US20100007632A1
CLAIM 6
. The semiconductor device according to claim 1 , wherein each of the plurality of color filters is one of a red color (usage frequency) filter , a green color filter , and a blue color filter .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (red color) to define a personalized holding pattern (liquid crystal element) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
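
Note: the following is a short Python sketch, illustrative only, of the thermal-sensing step that distinguishes claim 18 from claim 17: an assumed low-resolution thermal grid is thresholded to obtain the hand's heat signature, which then feeds the same polygon and frequency flow sketched above. The grid format and temperature threshold are assumptions.

    # Sketch of the thermal step only; grid format and threshold are assumed.
    HAND_TEMP_C = 31.0   # assumed threshold

    def heat_signature_cells(thermal_grid, threshold=HAND_TEMP_C):
        """thermal_grid: 2-D list of temperatures; returns warm (row, col) cells."""
        return [(r, c)
                for r, row in enumerate(thermal_grid)
                for c, t in enumerate(row)
                if t >= threshold]

    grid = [[22.0, 22.5, 23.0],
            [31.5, 32.0, 22.0],   # warm cells where the hand rests
            [31.2, 22.0, 22.0]]
    print(heat_signature_cells(grid))   # [(1, 0), (1, 1), (2, 0)]
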
US20100007632A1
CLAIM 3
. The semiconductor device according to claim 1 , wherein the display portion includes a liquid crystal element (holding pattern) .

US20100007632A1
CLAIM 6
. The semiconductor device according to claim 1 , wherein each of the plurality of color filters is one of a red color (usage frequency) filter , a green color filter , and a blue color filter .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2161645A2

Filed: 2009-05-15     Issued: 2010-03-10

Portable terminal and driving method of the same

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

Jong Hwan Kim, Tae Jin Lee
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (thumbnail image) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2161645A2
CLAIM 4
The portable terminal of any one of claims 1 to 3 , wherein said input (second mode) from a user determined by the controller depending on the motion pattern is a digit .

EP2161645A2
CLAIM 5
The portable terminal of any one of claims 1 to 4 , wherein , when the motion pattern is sensed during manual manipulation of the manipulation unit , the controller is configured to perform at least one action selected in the group of actions constituted of : setting a telephone number as a hot key number ;
calling a predetermined hot key number ;
transmitting a predetermined message to a phone number ;
selecting and displaying an item of a displayed list ;
selecting and enlarging a thumbnail image (display screen) ;
scrolling through a plurality of thumbnails images ;
switching images on a display ;
scrolling through a text ;
enlarging or reducing a text on a display ;
enlarging or reducing a displayed video reproduction area on a display ;
increasing or reducing at least one of bell sound , vibration strength and key button volume on a display ;
and/or changing a viewing channel when the portable terminal is in a broadcasting reception mode .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (thumbnail image) .
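
Note: the following is a minimal Python sketch, illustrative only, of the alternative cross-region multi-touch policies recited in claims 5, 6 and 7 (claims 6 and 7 are charted below): a multi-touch originating simultaneously in both regions is routed to the virtual bezel handler, to the active-region handler, or per a stored user instruction. The policy strings are assumptions.

    # Sketch of the alternative routing policies; names are assumed.
    def route_multitouch(touch_regions, policy="bezel", user_preference="bezel"):
        """touch_regions: set of regions ('active', 'bezel') in which the
        simultaneous touches originate; returns the region that handles them."""
        if touch_regions != {"active", "bezel"}:
            return next(iter(touch_regions))   # not a cross-region gesture
        if policy == "user_instructed":        # cf. claim 7: honour the stored
            return user_preference             # instruction given by the user
        return policy                          # cf. claims 5/6: fixed policy

    print(route_multitouch({"active", "bezel"}, policy="bezel"))    # cf. claim 5
    print(route_multitouch({"active", "bezel"}, policy="active"))   # cf. claim 6
    print(route_multitouch({"active", "bezel"}, policy="user_instructed",
                           user_preference="active"))               # cf. claim 7
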
EP2161645A2
CLAIM 5
The portable terminal of any one of claims 1 to 4 , wherein , when the motion pattern is sensed during manual manipulation of the manipulation unit , the controller is configured to perform at least one action selected in the group of actions constituted of : setting a telephone number as a hot key number ;
calling a predetermined hot key number ;
transmitting a predetermined message to a phone number ;
selecting and displaying an item of a displayed list ;
selecting and enlarging a thumbnail image (display screen) ;
scrolling through a plurality of thumbnails images ;
switching images on a display ;
scrolling through a text ;
enlarging or reducing a text on a display ;
enlarging or reducing a displayed video reproduction area on a display ;
increasing or reducing at least one of bell sound , vibration strength and key button volume on a display ;
and/or changing a viewing channel when the portable terminal is in a broadcasting reception mode .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (thumbnail image) .
EP2161645A2
CLAIM 5
The portable terminal of any one of claims 1 to 4 , wherein , when the motion pattern is sensed during manual manipulation of the manipulation unit , the controller is configured to perform at least one action selected in the group of actions constituted of : setting a telephone number as a hot key number ;
calling a predetermined hot key number ;
transmitting a predetermined message to a phone number ;
selecting and displaying an item of a displayed list ;
selecting and enlarging a thumbnail image (display screen) ;
scrolling through a plurality of thumbnails images ;
switching images on a display ;
scrolling through a text ;
enlarging or reducing a text on a display ;
enlarging or reducing a displayed video reproduction area on a display ;
increasing or reducing at least one of bell sound , vibration strength and key button volume on a display ;
and/or changing a viewing channel when the portable terminal is in a broadcasting reception mode .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
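
Note: the following is a minimal Python sketch, illustrative only, of the claim 8 toggle: an operating system status bar resident in the virtual bezel is shown or hidden (toggling a full-screen mode) when a gesture from a predefined set is received. The specific gesture name is an assumption.

    # Sketch of a gesture-driven status-bar toggle; the gesture name is assumed.
    class StatusBar:
        TOGGLE_GESTURES = {"two_finger_swipe_down"}   # assumed predefined set

        def __init__(self):
            self.visible = True   # status bar shown inside the virtual bezel

        def on_bezel_gesture(self, gesture):
            if gesture in self.TOGGLE_GESTURES:
                self.visible = not self.visible   # toggles full-screen mode
            return self.visible

    bar = StatusBar()
    print(bar.on_bezel_gesture("two_finger_swipe_down"))  # False: full screen
    print(bar.on_bezel_gesture("tap"))                    # False: unchanged
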
EP2161645A2
CLAIM 3
The portable terminal of claim 1 or 2 , wherein the manipulation unit comprises one (operating system status bar) of a touch screen or a designated key button .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (thumbnail image) comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items (input data) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2161645A2
CLAIM 3
The portable terminal of claim 1 or 2 , wherein the manipulation unit comprises one of a touch screen (electronic device status display panel) or a designated key button .

EP2161645A2
CLAIM 5
The portable terminal of any one of claims 1 to 4 , wherein , when the motion pattern is sensed during manual manipulation of the manipulation unit , the controller is configured to perform at least one action selected in the group of actions constituted of : setting a telephone number as a hot key number ;
calling a predetermined hot key number ;
transmitting a predetermined message to a phone number ;
selecting and displaying an item of a displayed list ;
selecting and enlarging a thumbnail image (display screen) ;
scrolling through a plurality of thumbnails images ;
switching images on a display ;
scrolling through a text ;
enlarging or reducing a text on a display ;
enlarging or reducing a displayed video reproduction area on a display ;
increasing or reducing at least one of bell sound , vibration strength and key button volume on a display ;
and/or changing a viewing channel when the portable terminal is in a broadcasting reception mode .

EP2161645A2
CLAIM 6
The portable terminal of any one of claims 1 to 5 , further comprising : a storage containing a motion pattern-input data (information items) base , wherein the controller is configured to access the motion pattern-input database to determine the input from the user .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
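
Note: the following is a minimal Python sketch, illustrative only, of the "third set of touch-based inputs" of claim 13: a navigation-only gesture set that remains available while the status display panel and soft buttons are hidden. Gesture and command names are assumptions.

    # Sketch only; gesture and command names are assumed.
    NAV_GESTURES = {                 # the "third set of touch-based inputs"
        "edge_swipe_left": "go_back",
        "edge_swipe_right": "go_forward",
        "edge_double_tap": "go_home",
    }

    def navigate_while_hidden(gesture, panel_hidden, buttons_hidden):
        if panel_hidden and buttons_hidden:
            return NAV_GESTURES.get(gesture)   # navigation remains possible
        return None                            # normal UI handles the input

    print(navigate_while_hidden("edge_swipe_left", True, True))    # 'go_back'
    print(navigate_while_hidden("edge_swipe_left", False, True))   # None
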
EP2161645A2
CLAIM 3
The portable terminal of claim 1 or 2 , wherein the manipulation unit comprises one of a touch screen (electronic device status display panel) or a designated key button .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (thumbnail image) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
EP2161645A2
CLAIM 4
The portable terminal of any one of claims 1 to 3 , wherein said input (second mode) from a user determined by the controller depending on the motion pattern is a digit .

EP2161645A2
CLAIM 5
The portable terminal of any one of claims 1 to 4 , wherein , when the motion pattern is sensed during manual manipulation of the manipulation unit , the controller is configured to perform at least one action selected in the group of actions constituted of : setting a telephone number as a hot key number ;
calling a predetermined hot key number ;
transmitting a predetermined message to a phone number ;
selecting and displaying an item of a displayed list ;
selecting and enlarging a thumbnail image (display screen) ;
scrolling through a plurality of thumbnails images ;
switching images on a display ;
scrolling through a text ;
enlarging or reducing a text on a display ;
enlarging or reducing a displayed video reproduction area on a display ;
increasing or reducing at least one of bell sound , vibration strength and key button volume on a display ;
and/or changing a viewing channel when the portable terminal is in a broadcasting reception mode .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (thumbnail image) , the gestural software application configured to produce the second mode (said input) of response in the virtual bezel region .
EP2161645A2
CLAIM 4
The portable terminal of any one of claims 1 to 3 , wherein said input (second mode) from a user determined by the controller depending on the motion pattern is a digit .

EP2161645A2
CLAIM 5
The portable terminal of any one of claims 1 to 4 , wherein , when the motion pattern is sensed during manual manipulation of the manipulation unit , the controller is configured to perform at least one action selected in the group of actions constituted of : setting a telephone number as a hot key number ;
calling a predetermined hot key number ;
transmitting a predetermined message to a phone number ;
selecting and displaying an item of a displayed list ;
selecting and enlarging a thumbnail image (display screen) ;
scrolling through a plurality of thumbnails images ;
switching images on a display ;
scrolling through a text ;
enlarging or reducing a text on a display ;
enlarging or reducing a displayed video reproduction area on a display ;
increasing or reducing at least one of bell sound , vibration strength and key button volume on a display ;
and/or changing a viewing channel when the portable terminal is in a broadcasting reception mode .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN101655769A

Filed: 2009-03-23     Issued: 2010-02-24

便携式终端及其驱动方法 (Portable terminal and driving method of the same)

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

金钟焕, 李泰镇
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel (且显示) region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 3
. The display system according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel (且显示) region is processed as a touch-based input within the active touchscreen region .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 4
. The display system according to claim 1 , wherein a touch-based input originating in the virtual bezel (且显示) region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
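
Note: the following is a minimal Python sketch, illustrative only, of the origin-based routing of claims 3 and 4 (claim 3 is charted immediately above): a stroke that crosses the boundary between the two regions is handled by the region in which it originated, regardless of where it terminates. Screen and bezel dimensions are assumptions.

    # Sketch of the origin-based rule; dimensions are assumed.
    def region_of(point, bezel=60, screen_w=1080, screen_h=1920):
        x, y = point
        inside = (bezel <= x < screen_w - bezel and
                  bezel <= y < screen_h - bezel)
        return "active" if inside else "bezel"

    def route_stroke(start, end):
        # The terminating region is deliberately ignored (claims 3 and 4).
        return region_of(start)

    print(route_stroke(start=(540, 960), end=(10, 960)))   # 'active' (claim 3)
    print(route_stroke(start=(10, 960), end=(540, 960)))   # 'bezel'  (claim 4)
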
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (且显示) region is processed as a multi-touch input within the virtual bezel region of the display screen .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (且显示) region is processed as a multi-touch input within the active touchscreen region of the display screen .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (且显示) region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel (且显示) region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 9
. The display system according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel (且显示) region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
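
Note: the following is a minimal Python sketch, illustrative only, of the soft-button handling of claim 9 and of claim 11 (charted below): predefined soft buttons may be repositioned, and new ones added, only within the bounds of the virtual bezel region. Button names and bezel geometry are assumptions.

    # Sketch of a soft-button registry confined to the virtual bezel; assumed geometry.
    class BezelSoftButtons:
        def __init__(self, bezel_rects):
            self.bezel_rects = bezel_rects   # (x, y, w, h) strips along the edges
            self.buttons = {}                # name -> (x, y)

        def _in_bezel(self, x, y):
            return any(bx <= x < bx + bw and by <= y < by + bh
                       for bx, by, bw, bh in self.bezel_rects)

        def add(self, name, x, y):           # cf. claim 11: add a soft button
            if self._in_bezel(x, y):
                self.buttons[name] = (x, y)
            return name in self.buttons

        def reposition(self, name, x, y):    # cf. claim 9: reposition a button
            if name in self.buttons and self._in_bezel(x, y):
                self.buttons[name] = (x, y)
                return True
            return False

    ui = BezelSoftButtons(bezel_rects=[(0, 0, 60, 1920), (1020, 0, 60, 1920)])
    print(ui.add("back", 30, 900))           # True: inside the left bezel strip
    print(ui.reposition("back", 500, 900))   # False: target is the active region
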
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel (且显示) region .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel (且显示) region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel (且显示) display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel (且显示) display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel (且显示) region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
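
Note: the following is a minimal Python sketch, illustrative only, of the personalization steps of claim 16: after a detected grip region is registered as the virtual bezel, the system offers response types for a bezel gesture and stores the user's instruction as personalized behavior for that region. The response choices and gesture names are assumptions.

    # Sketch of the personalization steps; choices and gesture names are assumed.
    class PersonalizedBezel:
        def __init__(self):
            self.regions = {}     # region id -> {gesture: chosen response}

        def register_region(self, region_id):
            self.regions.setdefault(region_id, {})

        def offer_choices(self, region_id, gesture, choices, user_pick):
            # 'choices' stands in for the system offering response types;
            # 'user_pick' is the user's instruction, stored per region as
            # personalized behavior for the virtual bezel.
            if user_pick in choices:
                self.regions[region_id][gesture] = user_pick
            return self.regions[region_id].get(gesture)

        def respond(self, region_id, gesture):
            return self.regions.get(region_id, {}).get(gesture, "ignore")

    bezel = PersonalizedBezel()
    bezel.register_region("left_edge")
    bezel.offer_choices("left_edge", "swipe_up", {"scroll", "volume"}, "scroll")
    print(bezel.respond("left_edge", "swipe_up"))   # 'scroll'
    print(bezel.respond("left_edge", "tap"))        # 'ignore'
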
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel (且显示) region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (包括使用) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

CN101655769A
CLAIM 25
. The method according to claim 16 , wherein said step (b) comprises using (usage frequency) a motion pattern-command database , in which one field is a specific key or a partial region of the touch screen , another field is a motion pattern , and a further field is a command .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel (且显示) region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (包括使用) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN101655769A
CLAIM 9
. The portable terminal according to claim 3 , wherein , in a case where the designated motion pattern is applied while the touch screen or a designated key button is pressed and a predetermined item list or thumbnails are displayed (virtual bezel, virtual bezel region) , the controller displays the list item or thumbnail of interest on the entire screen .

CN101655769A
CLAIM 25
. The method according to claim 16 , wherein said step (b) comprises using (usage frequency) a motion pattern-command database , in which one field is a specific key or a partial region of the touch screen , another field is a motion pattern , and a further field is a command .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20090259969A1

Filed: 2009-02-07     Issued: 2009-10-15

Multimedia client interface devices and methods

(Original Assignee) Matt Pallakoff     

Matt Pallakoff
US9645663B2
CLAIM 1
. A display system for an electronic device (other port, user input) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (first type) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (other port, user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type (first set) configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (other port, user input) for the gestural hardware on how a multi-touch input will be processed .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

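Claim 7 above conditions the handling of a multi-touch that begins simultaneously in both regions on an instruction previously given by the user. A minimal Python sketch of such a user-configurable policy, with hypothetical policy names, follows; it is an illustration, not an implementation taken from either document.

    from enum import Enum

    class Policy(Enum):
        TREAT_AS_ACTIVE = "active"   # process as active-region multi-touch
        TREAT_AS_BEZEL = "bezel"     # process as bezel-region multi-touch

    def classify_multitouch(start_regions, user_policy: Policy) -> str:
        """start_regions: region ('active' or 'bezel') in which each contact began."""
        if {"active", "bezel"} <= set(start_regions):
            # Contacts began simultaneously in both regions: defer to the
            # user's stored instruction (claim 7 style behavior).
            return f"spanning gesture -> handled as {user_policy.value} input"
        return f"single-region gesture -> handled as {start_regions[0]} input"

    print(classify_multitouch(["bezel", "active"], Policy.TREAT_AS_ACTIVE))
    print(classify_multitouch(["active", "active"], Policy.TREAT_AS_BEZEL))
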
US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (other port, user input) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

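Steps (c) and (d) of the US20090259969A1 claim 1 excerpt charted above describe showing a bubble above and/or beside the contact point so that the pressed item remains visible under the finger. The geometry in the Python sketch below is purely illustrative; the offset value and function names are assumptions.

    def bubble_position(finger_x, finger_y, screen_w, offset=60):
        """Place an identification bubble above and to the side of the contact
        point so it is not hidden under the finger (illustrative geometry)."""
        bx = min(finger_x + offset, screen_w - offset)  # nudge sideways, stay on screen
        by = max(finger_y - offset, 0)                  # nudge upward
        return bx, by

    def on_down_contact(item_label, finger_x, finger_y, screen_w=1080):
        # Step (c): DOWN contact received over a selectable item.
        # Step (d): show a bubble identifying the item being pressed.
        bx, by = bubble_position(finger_x, finger_y, screen_w)
        return f"bubble '{item_label}' drawn at ({bx}, {by})"

    print(on_down_contact("Settings link", finger_x=300, finger_y=900))
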
US9645663B2
CLAIM 13
. The electronic device (other port, user input) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

US9645663B2
CLAIM 14
. An electronic device (other port, user input) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (other port, user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

US9645663B2
CLAIM 15
. The electronic device (other port, user input) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (other port, user input) having a touchscreen display (contact point) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (other port, user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point (touchscreen display) , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

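Claim 16 above recites detecting where the holding hand rests, registering that region as the virtual bezel, and storing the user's chosen response for it. A minimal Python sketch of that flow, with hypothetical storage and method names that are not drawn from the patent, is given below.

    class BezelCalibrator:
        """Illustrative flow: register where the holding hand rests and store
        a per-user response preference for that region."""

        def __init__(self):
            self.memory = {}  # stands in for device non-volatile storage

        def register_holding_region(self, contact_points):
            # Detect the region touched while the device is merely being held.
            region = {"points": list(contact_points)}
            self.memory["virtual_bezel_region"] = region
            return region

        def store_user_preference(self, response_type):
            # The user is offered a choice of response; record it for the region.
            self.memory["bezel_response"] = response_type

        def handle_bezel_touch(self, point):
            # Interpret a touch inside the registered region as intentional input
            # meant to affect content outside the bezel.
            pref = self.memory.get("bezel_response", "ignore")
            return f"touch {point} in bezel -> apply '{pref}' to active-region content"

    cal = BezelCalibrator()
    cal.register_holding_region([(5, 700), (8, 760), (6, 820)])
    cal.store_user_preference("scroll")
    print(cal.handle_bezel_touch((7, 750)))
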
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (other port, user input) having a touchscreen display (contact point) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (other port, user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point (touchscreen display) , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

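Claim 17 above builds the virtual bezel from the polygonal areas covered by unintentional holding touches and how frequently each is hit. The Python sketch below illustrates one way such a flow could look; the threshold and data structures are the editor's assumptions, not the patent's.

    from collections import Counter

    class HoldingPatternLearner:
        """Illustrative accumulation of polygonal areas covered by
        unintentional holding touches and their access frequency."""

        def __init__(self):
            self.polygons = {}          # polygon key -> list of (x, y) vertices
            self.access_count = Counter()

        def record_unintentional_touch(self, vertices):
            key = tuple(sorted(vertices))
            self.polygons.setdefault(key, list(vertices))
            self.access_count[key] += 1

        def holding_pattern(self, min_hits=3):
            # Polygons touched often enough are treated as the personalized
            # holding pattern and become the virtual bezel region.
            return [self.polygons[k] for k, n in self.access_count.items() if n >= min_hits]

    learner = HoldingPatternLearner()
    thumb_area = [(0, 650), (30, 650), (30, 820), (0, 820)]
    for _ in range(4):
        learner.record_unintentional_touch(thumb_area)
    print(learner.holding_pattern())
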
US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (other port, user input) having a touchscreen display (contact point) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (other port, user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20090259969A1
CLAIM 1
. A method of providing a user interface for a mobile device , the method comprising : (a) displaying one or more first virtual selectable items on the touch-screen display having a first type configured to respond to user input (electronic device, user input) on the touch-screen indicating a user DOWN contact with the touchscreen ;
and (b) simultaneously to step (a) , displaying one or more second virtual selectable items in a web-page area of a browser window on the touch-screen display , the one or more second virtual selectable items having a second type configured to respond to user input on the touch-screen indicating a user UP contact release from the touchscreen ;
(c) receiving a user finger input on the touch-screen indicating a user DOWN contact in an area of the touchscreen associated with a selected one of the second virtual selectable items ;
and (d) in response to step (c) , displaying a box or bubble substantially above and/or to the side of the finger contact point (touchscreen display) , where said box or bubble shows said one of second virtual selectable items or information identifying said one of first virtual selectable items—so that the box or bubble shows an indication of which item is being pressed even if the finger is covering the item .

US20090259969A1
CLAIM 4
. The method of claim 1 , further comprising , if the user's finger moves to cover another portion (electronic device, user input) of the display without lifting , updating the bubble's content to correspondingly display the newly covered information or associated identifying information .

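Claim 18 above derives the same polygonal area from a heat signature captured by thermal sensors. The short Python sketch below shows one assumed reduction of a thermal grid to a bounding polygon; the threshold, grid layout, and use of a simple bounding rectangle are illustrative choices, not details from the patent.

    def heat_signature_polygon(thermal_grid, threshold=30.0):
        """Illustrative reduction of a thermal-sensor grid to the vertices of
        a bounding polygon around cells warmed by the holding hand."""
        warm = [(x, y) for y, row in enumerate(thermal_grid)
                for x, temp in enumerate(row) if temp >= threshold]
        if not warm:
            return []
        xs, ys = zip(*warm)
        # Bounding rectangle as a simple stand-in for the recited polygonal area.
        return [(min(xs), min(ys)), (max(xs), min(ys)),
                (max(xs), max(ys)), (min(xs), max(ys))]

    grid = [[22, 22, 22],
            [34, 36, 22],
            [35, 33, 22]]
    print(heat_signature_polygon(grid))  # -> corners around the warm cells
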



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
KR20100065418A

Filed: 2008-12-08     Issued: 2010-06-17

Terminal having a flexible display unit and data display method thereof

(Original Assignee) Samsung Electronics Co., Ltd.

강태영, 김현진, 성주연, 함경운
US9645663B2
CLAIM 1
. A display system (중앙을) for an electronic device (영역들) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

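Claim 10 of KR20100065418A, as charted above, reacts to an asymmetric-bend event about the device centre by outputting slices of several text pages at a regular spacing within a fixed width. The Python sketch below is a minimal illustration with assumed event and layout parameters; none of the names are taken from the Korean publication.

    def on_bend_event(pages, strip_width=80, gap=10):
        """On an asymmetric bend about the device centre, show a narrow slice of
        each text page at a regular spacing within a fixed width (illustrative)."""
        layout = []
        x = 0
        for page in pages:
            layout.append({"page": page, "x": x, "width": strip_width})
            x += strip_width + gap
        return layout

    for strip in on_bend_event(["page 1", "page 2", "page 3"]):
        print(strip)
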
US9645663B2
CLAIM 2
. The display system (중앙을) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 3
. The display system (중앙을) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 4
. The display system (중앙을) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 5
. The display system (중앙을) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 6
. The display system (중앙을) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 7
. The display system (중앙을) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (영역들) for the gestural hardware on how a multi-touch input will be processed .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 8
. The display system (중앙을) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

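Claim 8 above places the operating-system status bar in the virtual bezel and lets a predefined gesture set toggle its visibility for a full-screen mode. A minimal Python sketch with hypothetical gesture names (not taken from the patent) follows.

    class StatusBarController:
        """Illustrative gesture-driven toggle for a status bar living in the
        virtual bezel region; the gesture names are assumptions."""

        TOGGLE_GESTURES = {"two-finger-swipe-down", "edge-double-tap"}

        def __init__(self):
            self.visible = True

        def on_gesture(self, gesture: str) -> str:
            if gesture in self.TOGGLE_GESTURES:
                self.visible = not self.visible
                return "status bar shown" if self.visible else "full-screen mode"
            return "gesture ignored"

    bar = StatusBarController()
    print(bar.on_gesture("edge-double-tap"))        # -> full-screen mode
    print(bar.on_gesture("two-finger-swipe-down"))  # -> status bar shown
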
US9645663B2
CLAIM 9
. The display system (중앙을) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 10
. The display system (중앙을) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 11
. The display system (중앙을) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 12
. The display system (중앙을) according to claim 9 , wherein the display screen comprises an electronic device (영역들) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center (중앙을: display system) of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 13
. The electronic device (영역들) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 14
. An electronic device (영역들) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 15
. The electronic device (영역들) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (영역들) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (영역들) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (영역들) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
KR20100065418A
CLAIM 10
The terminal having a flexible display unit according to claim 1 , characterized in that , when a bending event due to asymmetric bending about the center of the terminal occurs , the controller controls partial regions (영역들: electronic device) of a plurality of text pages to be output at regular intervals within a predetermined width .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20100030549A1

Filed: 2008-07-31     Issued: 2010-02-04

Mobile device having human language translation capability with positional feedback

(Original Assignee) Apple Inc     (Current Assignee) Apple Inc

Michael M. Lee, Justin Gregg, Chad G. Seguin
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (ninety degrees) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (second modes) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US20100030549A1
CLAIM 3
. The device of claim 2 wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and the second virtual keyboard when the device is rotated by about ninety degrees (first mode) or more in a plane of the touch sensitive screen .

US20100030549A1
CLAIM 16
. The device of claim 15 wherein the translator is to switch between the first and second modes (second mode, screen mode) in response to the device undergoing a rapid translation movement that is brought about by a user tapping the device .

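Claims 3 and 16 of US20100030549A1, as charted above, switch between the first and second keyboard/translation modes when the device is rotated by roughly ninety degrees in the plane of the screen or undergoes a tap. The Python sketch below is an illustration with assumed sensor inputs and mode labels, not the reference's implementation.

    def on_motion_sample(rotation_deg, tap_detected, current_mode):
        """Illustrative mode switch: rotate the device by roughly ninety degrees,
        or tap it, to flip between the first-language and second-language
        keyboard/translation modes."""
        other = "second-language mode" if current_mode == "first-language mode" else "first-language mode"
        if abs(rotation_deg) >= 90 or tap_detected:
            return other
        return current_mode

    mode = "first-language mode"
    mode = on_motion_sample(rotation_deg=95, tap_detected=False, current_mode=mode)
    print(mode)  # -> second-language mode
    mode = on_motion_sample(rotation_deg=0, tap_detected=True, current_mode=mode)
    print(mode)  # -> first-language mode
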
US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (ninety degrees) of response in the active touchscreen region .
US20100030549A1
CLAIM 3
. The device of claim 2 wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and the second virtual keyboard when the device is rotated by about ninety degrees (first mode) or more in a plane of the touch sensitive screen .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US20100030549A1
CLAIM 17
. The device of claim 15 wherein the translator is to switch between the first and second modes in response to the device undergoing a rotation by about 90 degrees or more in a plane of the touch screen (electronic device status display panel) .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US20100030549A1
CLAIM 17
. The device of claim 15 wherein the translator is to switch between the first and second modes in response to the device undergoing a rotation by about 90 degrees or more in a plane of the touch screen (electronic device status display panel) .

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (ninety degrees) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (second modes) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US20100030549A1
CLAIM 3
. The device of claim 2 wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and the second virtual keyboard when the device is rotated by about ninety degrees (first mode) or more in a plane of the touch sensitive screen .

US20100030549A1
CLAIM 16
. The device of claim 15 wherein the translator is to switch between the first and second modes (second mode, screen mode) in response to the device undergoing a rapid translation movement that is brought about by a user tapping the device .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (second modes) of response in the virtual bezel region .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US20100030549A1
CLAIM 16
. The device of claim 15 wherein the translator is to switch between the first and second modes (second mode, screen mode) in response to the device undergoing a rapid translation movement that is brought about by a user tapping the device .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20100030549A1
CLAIM 1
. A mobile electronic device (electronic device) comprising : a touch sensitive screen ;
an accelerometer ;
and a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen , into a second human language , and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language , in response to the accelerometer detecting a change in the physical orientation of the device .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN101431563A

Filed: 2008-07-16     Issued: 2009-05-13

Mobile terminal

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

吴汉奎
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (的外侧) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
CN101431563A
CLAIM 4
. The mobile terminal according to claim 1 , characterized in that it further comprises a window integrally covering the touch pad , the window being disposed on the outer side (的外侧: second portion) of the touch pad .

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one (包括一) or more touch-based soft buttons within the virtual bezel region .
CN101431563A
CLAIM 4
. The mobile terminal according to claim 1 , characterized in that it further comprises (包括一: add one) a window integrally covering the touch pad , the window being disposed on the outer side of the touch pad .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (的外侧) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
CN101431563A
CLAIM 4
. The mobile terminal according to claim 1 , characterized in that it further comprises a window integrally covering the touch pad , the window being disposed on the outer side (的外侧: second portion) of the touch pad .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20090122026A1

Filed: 2008-06-27     Issued: 2009-05-14

Mobile terminal

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

Han-Gyu OH
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (non-edge portion) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (inner side) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (opposite surface) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20090122026A1
CLAIM 3
. The mobile terminal of claim 1 , wherein a display module that outputs visual information and a circuit board are configured at an inner side (touchscreen layer) of the touch sheet , and the display module is disposed at the first region and the circuit board is disposed at the second region .

US20090122026A1
CLAIM 7
. The mobile terminal of claim 6 , wherein a conductive shielding line for shielding the data lines of the touch screen pattern portion is configured at edges of the first region of the touch sheet , and positioned on the opposite surface (user input, user input area) of the insulation layer on which the data lines of the touch screen pattern portion are disposed .

US20090122026A1
CLAIM 27
. A user interface comprising : a touch sensitive screen with a planar surface to receive user inputs and to display information ;
and at least one mechanical button implemented at a non-edge portion (display screen) in the planar surface of the touch sensitive screen , wherein the touch sensitive screen has a first section with a first pattern of touch sensors and a second section with a second pattern of touch sensors .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (non-edge portion) .
US20090122026A1
CLAIM 27
. A user interface comprising : a touch sensitive screen with a planar surface to receive user inputs and to display information ;
and at least one mechanical button implemented at a non-edge portion (display screen) in the planar surface of the touch sensitive screen , wherein the touch sensitive screen has a first section with a first pattern of touch sensors and a second section with a second pattern of touch sensors .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (non-edge portion) .
US20090122026A1
CLAIM 27
. A user interface comprising : a touch sensitive screen with a planar surface to receive user inputs and to display information ;
and at least one mechanical button implemented at a non-edge portion (display screen) in the planar surface of the touch sensitive screen , wherein the touch sensitive screen has a first section with a first pattern of touch sensors and a second section with a second pattern of touch sensors .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (non-edge portion) comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20090122026A1
CLAIM 1
. A mobile terminal comprising : a terminal body having a first and a second region ;
and a touch sheet that senses a touch applied to the first and second regions , wherein the touch sheet comprises a touch screen (electronic device status display panel) pattern portion that includes multiple conductive lines to sense a touch applied to the first region and a touch button pattern portion disposed to have a certain area on the second region .

US20090122026A1
CLAIM 27
. A user interface comprising : a touch sensitive screen with a planar surface to receive user inputs and to display information ;
and at least one mechanical button implemented at a non-edge portion (display screen) in the planar surface of the touch sensitive screen , wherein the touch sensitive screen has a first section with a first pattern of touch sensors and a second section with a second pattern of touch sensors .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20090122026A1
CLAIM 1
. A mobile terminal comprising : a terminal body having a first and a second region ;
and a touch sheet that senses a touch applied to the first and second regions , wherein the touch sheet comprises a touch screen (electronic device status display panel) pattern portion that includes multiple conductive lines to sense a touch applied to the first region and a touch button pattern portion disposed to have a certain area on the second region .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (non-edge portion) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (inner side) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (opposite surface) intended to affect the display of the first portion of the content on the active touchscreen region .
US20090122026A1
CLAIM 3
. The mobile terminal of claim 1 , wherein a display module that outputs visual information and a circuit board are configured at an inner side (touchscreen layer) of the touch sheet , and the display module is disposed at the first region and the circuit board is disposed at the second region .

US20090122026A1
CLAIM 7
. The mobile terminal of claim 6 , wherein a conductive shielding line for shielding the data lines of the touch screen pattern portion is configured at edges of the first region of the touch sheet , and positioned on the opposite surface (user input, user input area) of the insulation layer on which the data lines of the touch screen pattern portion are disposed .

US20090122026A1
CLAIM 27
. A user interface comprising : a touch sensitive screen with a planar surface to receive user inputs and to display information ;
and at least one mechanical button implemented at a non-edge portion (display screen) in the planar surface of the touch sensitive screen , wherein the touch sensitive screen has a first section with a first pattern of touch sensors and a second section with a second pattern of touch sensors .

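Analyst note (illustrative only): a minimal Python sketch, assuming a rectangular screen with fixed-width edge bands and using hypothetical names (Screen, dispatch_touch), of the two-mode response recited in claims 1 and 14: touches in the virtual bezel are only selectively treated as intentional, while touches in the active region are ordinary content input.

# Hypothetical sketch of a two-mode dispatch for a "virtual bezel" display:
# touches inside edge bands are handled in a second mode (selectively treated
# as intentional input affecting the active region), while touches elsewhere
# are handled in the first mode. Geometry and names are illustrative only.

from dataclasses import dataclass

@dataclass
class Screen:
    width: int
    height: int
    bezel: int  # width of the virtual bezel band along each edge, in pixels

    def in_virtual_bezel(self, x, y):
        return (x < self.bezel or x >= self.width - self.bezel or
                y < self.bezel or y >= self.height - self.bezel)

def dispatch_touch(screen, x, y, is_gesture):
    if screen.in_virtual_bezel(x, y):
        # Second mode: only recognized gestures count as intentional input.
        return "bezel_intentional" if is_gesture else "bezel_ignored"
    # First mode: ordinary content interaction in the active region.
    return "active_input"

screen = Screen(width=1080, height=1920, bezel=48)
print(dispatch_touch(screen, 10, 500, is_gesture=True))    # bezel_intentional
print(dispatch_touch(screen, 10, 500, is_gesture=False))   # bezel_ignored
print(dispatch_touch(screen, 540, 960, is_gesture=False))  # active_input
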
US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (non-edge portion) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20090122026A1
CLAIM 27
. A user interface comprising : a touch sensitive screen with a planar surface to receive user inputs and to display information ;
and at least one mechanical button implemented at a non-edge portion (display screen) in the planar surface of the touch sensitive screen , wherein the touch sensitive screen has a first section with a first pattern of touch sensors and a second section with a second pattern of touch sensors .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (opposite surface) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20090122026A1
CLAIM 7
. The mobile terminal of claim 6 , wherein a conductive shielding line for shielding the data lines of the touch screen pattern portion is configured at edges of the first region of the touch sheet , and positioned on the opposite surface (user input, user input area) of the insulation layer on which the data lines of the touch screen pattern portion are disposed .

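Analyst note (illustrative only): a minimal Python sketch of the claim-16 flow charted above, detecting the holding hand's contact region, registering it as the virtual bezel, and storing the user's chosen response type as personalized behavior. The class name, the bounding-box simplification, and the in-memory dict standing in for device memory are all assumptions.

# Hypothetical sketch of the claim-16-style flow: the region touched by the
# holding hand is registered as the virtual bezel, and the user's chosen
# response type is stored as personalized behavior for that region.

class BezelPersonalization:
    def __init__(self):
        self.memory = {}  # stands in for device non-volatile memory

    def register_bezel(self, contact_points):
        # contact_points: list of (x, y) touches from the holding hand
        xs = [p[0] for p in contact_points]
        ys = [p[1] for p in contact_points]
        region = (min(xs), min(ys), max(xs), max(ys))  # bounding box
        self.memory["virtual_bezel_region"] = region
        return region

    def register_response_instruction(self, response_type):
        # e.g. "scroll", "ignore", "page_turn" chosen by the user when prompted
        self.memory["bezel_response"] = response_type

    def interpret(self, x, y):
        x0, y0, x1, y1 = self.memory["virtual_bezel_region"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return self.memory.get("bezel_response", "default")
        return "active_region_input"

p = BezelPersonalization()
p.register_bezel([(5, 300), (8, 420), (12, 520)])
p.register_response_instruction("page_turn")
print(p.interpret(7, 400))    # page_turn
print(p.interpret(500, 800))  # active_region_input
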
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (opposite surface) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first region) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20090122026A1
CLAIM 1
. A mobile terminal comprising : a terminal body having a first and a second region ;
and a touch sheet that senses a touch applied to the first and second regions , wherein the touch sheet comprises a touch screen pattern portion that includes multiple conductive lines to sense a touch applied to the first region (holding pattern) and a touch button pattern portion disposed to have a certain area on the second region .

US20090122026A1
CLAIM 7
. The mobile terminal of claim 6 , wherein a conductive shielding line for shielding the data lines of the touch screen pattern portion is configured at edges of the first region of the touch sheet , and positioned on the opposite surface (user input, user input area) of the insulation layer on which the data lines of the touch screen pattern portion are disposed .

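Analyst note (illustrative only): a minimal Python sketch of the claim-17 idea, accumulating unintentional-touch polygons, counting how often each is accessed, and promoting frequently hit polygons to a personalized holding pattern. The frequency threshold and all names are assumptions.

# Hypothetical sketch of the claim-17-style flow: unintentional touches while
# holding the device accumulate into polygonal areas whose access frequency is
# tracked; frequently hit polygons define a personalized holding pattern that
# becomes the virtual bezel region.

from collections import defaultdict

class HoldingPatternLearner:
    def __init__(self, frequency_threshold=3):
        self.polygon_counts = defaultdict(int)
        self.frequency_threshold = frequency_threshold

    def register_unintentional_touch(self, vertices):
        # vertices: tuple of (x, y) points outlining the touched polygonal area
        key = tuple(sorted(vertices))
        self.polygon_counts[key] += 1

    def holding_pattern(self):
        # Polygons accessed often enough define the personalized holding pattern.
        return [list(poly) for poly, count in self.polygon_counts.items()
                if count >= self.frequency_threshold]

learner = HoldingPatternLearner(frequency_threshold=2)
thumb_area = ((0, 700), (60, 700), (60, 900), (0, 900))
learner.register_unintentional_touch(thumb_area)
learner.register_unintentional_touch(thumb_area)
print(learner.holding_pattern())  # the thumb polygon qualifies as the bezel
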
US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (opposite surface) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first region) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20090122026A1
CLAIM 1
. A mobile terminal comprising : a terminal body having a first and a second region ;
and a touch sheet that senses a touch applied to the first and second regions , wherein the touch sheet comprises a touch screen pattern portion that includes multiple conductive lines to sense a touch applied to the first region (holding pattern) and a touch button pattern portion disposed to have a certain area on the second region .

US20090122026A1
CLAIM 7
. The mobile terminal of claim 6 , wherein a conductive shielding line for shielding the data lines of the touch screen pattern portion is configured at edges of the first region of the touch sheet , and positioned on the opposite surface (user input, user input area) of the insulation layer on which the data lines of the touch screen pattern portion are disposed .

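Analyst note (illustrative only): a minimal Python sketch of the claim-18 thermal step, thresholding a grid of temperature readings and returning the vertices of the warm area as a candidate polygonal bezel region. The grid layout, the threshold value, and the function name are assumptions.

# Hypothetical sketch of a claim-18-style step: a grid of thermal readings is
# thresholded into "warm" cells, and the bounding vertices of the warm area are
# registered as a candidate polygonal bezel region.

def heat_signature_polygon(thermal_grid, threshold=31.0):
    """Return corner vertices of the warm area detected on a thermal grid."""
    warm = [(x, y)
            for y, row in enumerate(thermal_grid)
            for x, temp in enumerate(row)
            if temp >= threshold]
    if not warm:
        return None
    xs = [p[0] for p in warm]
    ys = [p[1] for p in warm]
    # Bounding rectangle of the warm cells, returned as polygon vertices.
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

grid = [
    [24.0, 24.5, 25.0, 24.0],
    [33.0, 34.0, 25.0, 24.0],   # warm cells where the hand rests
    [33.5, 34.5, 25.0, 24.0],
]
print(heat_signature_polygon(grid))  # [(0, 1), (1, 1), (1, 2), (0, 2)]
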



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20090254855A1

Filed: 2008-04-08     Issued: 2009-10-08

Communication terminals with superimposed user interface

(Original Assignee) Sony Mobile Communications AB     (Current Assignee) Sony Mobile Communications AB

Martin Kretz, Tom Gajdos
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (user input device) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20090254855A1
CLAIM 10
. A method of operating an electronic device including a user input device (touchscreen area) and a display screen , the method comprising : superimposing a moving picture of a pointing object that is external to the electronic device onto the display screen wherein the picture of the pointing object is at least partially transparent ;
and interpreting a plurality of features of the pointing object as selection pointers so that a movement of the pointing object relative to the display screen is interpreted as movement of a plurality of selection pointers .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (s hand) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20090254855A1
CLAIM 21
. The electronic device of claim 1 , wherein the image representative of the pointing object comprises an image of a user's hand (s hand) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2009071336A2

Filed: 2008-02-21     Issued: 2009-06-11

Method for using accelerometer detected imagined key press

(Original Assignee) Nokia Corporation     

Vesa Luiro, Markku Pulkkinen, Anssi SAARIMÄKI, Antti Helander
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (receiving user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2009071336A2
CLAIM 1
. A method for defining an imaginary key on a portable apparatus , comprising :
receiving user input (user input, user input area) , wherein said user input comprises of one or more taps on a portable apparatus ;
detecting said one or more taps on said portable apparatus ;
determining at least one location of said one or more taps on said portable apparatus ;
and defining said imaginary key on said portable apparatus on said determined location .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2009071336A2
CLAIM 34
. A user interface comprising a casing , a display , and an input receiver , wherein said user interface is arranged to :
enable said portable apparatus to receive user input , wherein said user input comprises one (operating system status bar) or more taps on said portable apparatus ;
detecting said one or more taps on said portable apparatus ;
determining at least one location of said one or more taps on said portable apparatus ;
and defining an imaginary key on said portable apparatus on said determined location .

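Analyst note (illustrative only): a minimal Python sketch of the claim-8 behavior mapped above, toggling operating-system status-bar visibility (and hence full-screen mode) when a predefined gesture is performed in the virtual bezel. The gesture names are assumptions.

# Hypothetical sketch of a claim-8-style toggle: a predefined gesture performed
# in the virtual bezel flips the operating-system status bar between visible
# and hidden (full-screen) states.

class StatusBarController:
    TOGGLE_GESTURES = {"two_finger_swipe_down", "edge_double_tap"}

    def __init__(self):
        self.status_bar_visible = True

    def on_bezel_gesture(self, gesture):
        if gesture in self.TOGGLE_GESTURES:
            self.status_bar_visible = not self.status_bar_visible
        return "full_screen" if not self.status_bar_visible else "normal"

ctrl = StatusBarController()
print(ctrl.on_bezel_gesture("edge_double_tap"))  # full_screen
print(ctrl.on_bezel_gesture("edge_double_tap"))  # normal
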
US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (receiving user input) intended to affect the display of the first portion of the content on the active touchscreen region .
WO2009071336A2
CLAIM 1
. A method for defining an imaginary key on a portable apparatus , comprising :
receiving user input (user input, user input area) , wherein said user input comprises of one or more taps on a portable apparatus ;
detecting said one or more taps on said portable apparatus ;
determining at least one location of said one or more taps on said portable apparatus ;
and defining said imaginary key on said portable apparatus on said determined location .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (receiving user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2009071336A2
CLAIM 1
. A method for defining an imaginary key on a portable apparatus , comprising :
receiving user input (user input, user input area) , wherein said user input comprises of one or more taps on a portable apparatus ;
detecting said one or more taps on said portable apparatus ;
determining at least one location of said one or more taps on said portable apparatus ;
and defining said imaginary key on said portable apparatus on said determined location .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (receiving user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2009071336A2
CLAIM 1
. A method for defining an imaginary key on a portable apparatus , comprising :
receiving user input (user input, user input area) , wherein said user input comprises of one or more taps on a portable apparatus ;
detecting said one or more taps on said portable apparatus ;
determining at least one location of said one or more taps on said portable apparatus ;
and defining said imaginary key on said portable apparatus on said determined location .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature (time frame) from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (receiving user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2009071336A2
CLAIM 1
. A method for defining an imaginary key on a portable apparatus , comprising :
receiving user input (user input, user input area) , wherein said user input comprises of one or more taps on a portable apparatus ;
detecting said one or more taps on said portable apparatus ;
determining at least one location of said one or more taps on said portable apparatus ;
and defining said imaginary key on said portable apparatus on said determined location .

WO2009071336A2
CLAIM 3
. The method according to any one of claims 1-2 , wherein said one or more taps being tapped in sequence within a predetermined time frame (heat signature) .

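Analyst note (illustrative only): a minimal Python sketch of the WO2009071336A2-style step charted above, accepting taps only if they fall within a predetermined time frame and defining an imaginary key at their mean location. The 0.5-second window and the function name are assumptions.

# Hypothetical sketch: taps detected on the apparatus within a predetermined
# time frame are clustered into a single "imaginary key" anchored at their
# mean location.

def define_imaginary_key(taps, time_frame=0.5):
    """taps: list of (timestamp_seconds, x, y); returns key location or None."""
    if len(taps) < 2:
        return None
    taps = sorted(taps)
    first, last = taps[0][0], taps[-1][0]
    if last - first > time_frame:
        return None  # taps were not within the predetermined time frame
    xs = [t[1] for t in taps]
    ys = [t[2] for t in taps]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

print(define_imaginary_key([(0.00, 100, 40), (0.20, 104, 42)]))  # (102.0, 41.0)
print(define_imaginary_key([(0.00, 100, 40), (0.90, 104, 42)]))  # None
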



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2058729A1

Filed: 2008-02-18     Issued: 2009-05-13

Mobile terminal

(Original Assignee) LG Electronics Inc     (Current Assignee) LG Electronics Inc

Han-Gyu Oh
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (inner side) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2058729A1
CLAIM 3
The mobile terminal of any one of claims 1 and 2 , further comprising , on the inner side (touchscreen layer) of the touch sheet assembly , a display module (41) disposed below the first input region and a circuit board (52) disposed below the second input region .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (inner side) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
EP2058729A1
CLAIM 3
The mobile terminal of any one of claims 1 and 2 , further comprising , on the inner side (touchscreen layer) of the touch sheet assembly , a display module (41) disposed below the first input region and a circuit board (52) disposed below the second input region .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first region) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2058729A1
CLAIM 6
The mobile terminal of claim 5 , wherein touch sensitive areas of the first and second input regions respectively connect with data lines (65a , 66) to transmit signals , and the data lines (66) of the first region (holding pattern) are aligned along edges of the transmissive insulation layer and are congregated with data lines (65a) of the second region so as to be connected with an FPCB (Flexible Printed Circuit Board) (70) .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first region) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2058729A1
CLAIM 6
The mobile terminal of claim 5 , wherein touch sensitive areas of the first and second input regions respectively connect with data lines (65a , 66) to transmit signals , and the data lines (66) of the first region (holding pattern) are aligned along edges of the transmissive insulation layer and are congregated with data lines (65a) of the second region so as to be connected with an FPCB (Flexible Printed Circuit Board) (70) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20080088602A1

Filed: 2007-12-28     Issued: 2008-04-17

Multi-functional hand-held device

(Original Assignee) Apple Inc     (Current Assignee) Apple Inc

Steven Hotelling
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (current touch) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (including one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20080088602A1
CLAIM 1
. A hand-held multi-functional electronic device , comprising : a combined touch screen and user interface display ;
and a processing unit operatively connected to said touch screen and display , said processing unit capable of identifying and tracking a plurality of concurrent moving touch inputs from a user via said touch screen and discriminating a user requested action from the plurality of touch inputs ;
wherein the touch screen is adapted for recognition of a plurality of concurrent touch (first set) down locations anywhere on the surface .

US20080088602A1
CLAIM 12
. A hand-held electronic device as recited in claim 10 , wherein said user interface display comprises a standard region and a control region , the standard region being used to display data , and the control region including one (second set) or more virtual controls for user interaction .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20080088602A1
CLAIM 4
. A hand-held electronic device as recited in claim 1 , wherein said display is a substantially full screen (touchscreen display) display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20080088602A1
CLAIM 4
. A hand-held electronic device as recited in claim 1 , wherein said display is a substantially full screen (touchscreen display) display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20080088602A1
CLAIM 4
. A hand-held electronic device as recited in claim 1 , wherein said display is a substantially full screen (touchscreen display) display .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2008005505A2

Filed: 2007-07-05     Issued: 2008-01-10

Capacitance sensing electrode with integrated i/o device

(Original Assignee) Apple Inc.     

Steve Porter Hotelling
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device) comprising : a touch-sensitive display screen (touch pad) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (angular position) of response to a first set (angular position) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first communication) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

WO2008005505A2
CLAIM 16
. The I/O device as recited in claim 15 wherein the I/O device further includes a first communication (first portion) line electrically coupled to the electrode and a second communication line electrically coupled to the second connection points of the one or more I/O mechanisms .

WO2008005505A2
CLAIM 30
. The touch device as recited in claim 25 wherein the touch sensing nodes are laid out in a circular fashion such that each touch sensing node represents a distinct angular position (first mode, first set) within the touch plane .

WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (angular position) of response in the active touchscreen region .
WO2008005505A2
CLAIM 30
. The touch device as recited in claim 25 wherein the touch sensing nodes are laid out in a circular fashion such that each touch sensing node represents a distinct angular position (first mode, first set) within the touch plane .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch pad) .
WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch pad) .
WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

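Analyst note (illustrative only): a minimal Python sketch contrasting the claim-5, claim-6, and claim-7 alternatives for a multi-touch input that starts simultaneously in both regions; the resolution policy (prefer the bezel, prefer the active region, or follow a user instruction) is a hypothetical parameter.

# Hypothetical sketch of the claim-5/6/7 alternatives: a multi-touch input that
# starts simultaneously in the active region and the virtual bezel is attributed
# to one region according to a policy, which claim 7 lets the user choose.

def resolve_multitouch(touch_regions, policy="user_choice", user_choice="bezel"):
    """touch_regions: set of regions the simultaneous touches started in."""
    if touch_regions != {"active", "bezel"}:
        return next(iter(touch_regions))       # not a spanning multi-touch
    if policy == "prefer_bezel":               # claim 5 behavior
        return "bezel"
    if policy == "prefer_active":              # claim 6 behavior
        return "active"
    return user_choice                          # claim 7: user instruction

print(resolve_multitouch({"active", "bezel"}, policy="prefer_active"))  # active
print(resolve_multitouch({"active", "bezel"}, user_choice="bezel"))     # bezel
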
US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one (touch pad) or more touch-based soft buttons within the virtual bezel region .
WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

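Analyst note (illustrative only): a minimal Python sketch of the claim-11 capability, letting the user register soft buttons only inside the virtual bezel bands and hit-testing later bezel touches against them. The geometry and names are assumptions.

# Hypothetical sketch of claim-11-style behavior: the user places one or more
# soft buttons inside the virtual bezel region; later touches in the bezel are
# matched against the registered buttons.

class BezelSoftButtons:
    def __init__(self, bezel_rects):
        self.bezel_rects = bezel_rects   # list of (x0, y0, x1, y1) bezel bands
        self.buttons = {}                # label -> (x0, y0, x1, y1)

    def _inside(self, rect, x, y):
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1

    def add_button(self, label, rect):
        if any(self._inside(band, rect[0], rect[1]) for band in self.bezel_rects):
            self.buttons[label] = rect
            return True
        return False                      # button must lie within the bezel

    def hit_test(self, x, y):
        for label, rect in self.buttons.items():
            if self._inside(rect, x, y):
                return label
        return None

ui = BezelSoftButtons(bezel_rects=[(0, 0, 48, 1920)])       # left bezel band
ui.add_button("back", (4, 100, 44, 160))
print(ui.hit_test(20, 130))   # back
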
US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch pad) comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

WO2008005505A2
CLAIM 33
. The touch device as recited in claim 25 wherein the touch device is a touch screen (electronic device status display panel) .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

WO2008005505A2
CLAIM 33
. The touch device as recited in claim 25 wherein the touch device is a touch screen (electronic device status display panel) .

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen (touch pad) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (angular position) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first communication) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

WO2008005505A2
CLAIM 16
. The I/O device as recited in claim 15 wherein the I/O device further includes a first communication (first portion) line electrically coupled to the electrode and a second communication line electrically coupled to the second connection points of the one or more I/O mechanisms .

WO2008005505A2
CLAIM 30
. The touch device as recited in claim 25 wherein the touch sensing nodes are laid out in a circular fashion such that each touch sensing node represents a distinct angular position (first mode, first set) within the touch plane .

WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch pad) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

WO2008005505A2
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction (second communication lines) in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

WO2008005505A2
CLAIM 18
. The I/O device as recited in claim 16 further comprising a capacitor positioned between the first and second communication lines (screen mode, s response instruction) in order to include the one or more I/O mechanism in the total electrode area during capacitive sensing .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices (output mechanism) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2008005505A2
CLAIM 6
. The touch sensing device as recited in claim 1 wherein the I/O mechanism is an output mechanism (area comprising vertices) .

WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature (same communication) from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices (output mechanism) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2008005505A2
CLAIM 1
. A touch sensing device , comprising : one or more multifunctional nodes each of which represents a single touch pixel , each multifunctional node including a touch sensor with one or more integrated I/O mechanisms , the touch sensor and integrated I/O mechanisms sharing the same communication (heat signature) lines and I/O pins of a controller during operation of the touch sensing device .

WO2008005505A2
CLAIM 6
. The touch sensing device as recited in claim 1 wherein the I/O mechanism is an output mechanism (area comprising vertices) .

WO2008005505A2
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising ;
a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN101432677A

Filed: 2007-02-21     Issued: 2009-05-13

Electronic device having a display and a surrounding touch-sensitive bezel for user interface and control

(Original Assignee) Apple Computer Inc     (Current Assignee) Apple Inc

N·金, D·克尔, P·赫斯特, S·P·豪泰灵
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (plurality of resistors) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
CN101432677A
CLAIM 16
. The electronic device of claim 10 , wherein the touch-sensitive bezel comprises : a plurality of conductive pads arranged along at least a portion of the periphery of the display ; a plurality of resistors (first set) interconnecting the conductive pads ; and an integrated circuit coupled to at least three approximately equidistant points of the interconnected pads , the integrated circuit sensing measurements at the at least three points and determining a location of a touch event occurring on the bezel .

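Analyst note (illustrative only): a minimal Python sketch loosely inspired by the CN101432677A claim 16 arrangement, estimating where along a ring-shaped bezel a touch occurred from signals read at three roughly equidistant sensing points. This is a generic signal-weighted circular mean offered for orientation, not the reference's actual resistive-network circuitry.

# Hypothetical sketch: three sensing points spaced roughly evenly around the
# bezel each report a signal strength, and the touch location is estimated as
# a signal-weighted circular mean of the sensing-point angles.

import math

SENSE_ANGLES = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]  # three equidistant points

def estimate_touch_angle(signals):
    """signals: strengths measured at the three sensing points."""
    total = sum(signals)
    x = sum(s * math.cos(a) for s, a in zip(signals, SENSE_ANGLES)) / total
    y = sum(s * math.sin(a) for s, a in zip(signals, SENSE_ANGLES)) / total
    return math.atan2(y, x) % (2 * math.pi)

# A touch near the first sensing point dominates its signal.
print(round(estimate_touch_angle([0.9, 0.2, 0.2]), 2))  # 0.0 (nearest first point)
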
US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one (compare, comprises an) or more touch-based soft buttons within the virtual bezel region .
CN101432677A
CLAIM 19
. The electronic device of claim 10 , wherein , in order to determine whether acquired touch data invokes at least one control , the processing circuitry is configured to : designate at least one area of the touch-sensitive bezel to at least one control ; compare (add one) the acquired touch data with the at least one designated area of the touch-sensitive bezel ; and determine , based on the comparison , at least one control invoked by the acquired touch data .

CN101432677A
CLAIM 28
. The electronic device of claim 25 , wherein the multi-touch input bezel comprises an (add one) array having a plurality of capacitive sensors arranged in rows and columns substantially around the periphery of the display .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items (a location) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
CN101432677A
CLAIM 1
. An electronic device , comprising : a display located on the electronic device and having a periphery ; at least one touch-sensitive surface located on the electronic device and adjacent to at least a portion of the periphery of the display ; and processing circuitry operably connected to the display and the at least one touch-sensitive surface , the processing circuitry configured to : designate at least one area of the at least one touch-sensitive surface to at least one control ; generate at least one visual guide for the at least one control ; and present the at least one visual guide for display at a location (information items) on the display adjacent to the at least one area designated to the at least one control .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (having a periphery) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
CN101432677A
CLAIM 1
. An electronic device , comprising : a display located on the electronic device and having a periphery (virtual bezel display screen) ; at least one touch-sensitive surface located on the electronic device and adjacent to at least a portion of the periphery of the display ; and processing circuitry operably connected to the display and the at least one touch-sensitive surface , the processing circuitry configured to : designate at least one area of the at least one touch-sensitive surface to at least one control ; generate at least one visual guide for the at least one control ; and present the at least one visual guide for display at a location on the display adjacent to the at least one area designated to the at least one control .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (having a periphery) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
CN101432677A
CLAIM 1
. An electronic device , comprising : a display located on the electronic device and having a periphery (virtual bezel display screen) ; at least one touch-sensitive surface located on the electronic device and adjacent to at least a portion of the periphery of the display ; and processing circuitry operably connected to the display and the at least one touch-sensitive surface , the processing circuitry configured to : designate at least one area of the at least one touch-sensitive surface to at least one control ; generate at least one visual guide for the at least one control ; and present the at least one visual guide for display at a location on the display adjacent to the at least one area designated to the at least one control .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20080007533A1

Filed: 2006-07-06     Issued: 2008-01-10

Capacitance sensing electrode with integrated I/O mechanism

(Original Assignee) Apple Computer Inc     (Current Assignee) Apple Inc

Steve P. Hotelling
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device) comprising : a touch-sensitive display screen (touch pad) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (angular position) of response to a first set (angular position) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first communication) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US20080007533A1
CLAIM 16
. The I/O device as recited in claim 15 wherein the I/O device further includes a first communication (first portion) line electrically coupled to the electrode and a second communication line electrically coupled to the second connection points of the one or more I/O mechanisms .

US20080007533A1
CLAIM 30
. The touch device as recited in claim 25 wherein the touch sensing nodes are laid out in a circular fashion such that each touch sensing node represents a distinct angular position (first mode, first set) within the touch plane .

US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (angular position) of response in the active touchscreen region .
US20080007533A1
CLAIM 30
. The touch device as recited in claim 25 wherein the touch sensing nodes are laid out in a circular fashion such that each touch sensing node represents a distinct angular position (first mode, first set) within the touch plane .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch pad) .
US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch pad) .
US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one (touch pad) or more touch-based soft buttons within the virtual bezel region .
US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch pad) comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US20080007533A1
CLAIM 33
. The touch device as recited in claim 25 wherein the touch device is a touch screen (electronic device status display panel) .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US20080007533A1
CLAIM 33
. The touch device as recited in claim 25 wherein the touch device is a touch screen (electronic device status display panel) .
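
Claims 12 and 13 above recite a status display panel that the user toggles between visible and hidden modes, plus a third set of touch inputs for navigating while the panel and the soft buttons are hidden. A minimal Python sketch of that state machine, with the gesture names being hypothetical placeholders:

```python
class StatusPanelController:
    """Illustrative state machine for claims 12-13: a status display panel the
    user toggles between visible and hidden modes, with a separate set of
    navigation gestures honored while the panel and soft buttons are hidden."""

    def __init__(self) -> None:
        self.panel_visible = True
        self.soft_buttons_visible = True

    def toggle_panel(self) -> None:
        # Claim 12: toggle the electronic device status display panel
        # between a visible mode and a hidden mode.
        self.panel_visible = not self.panel_visible

    def handle_gesture(self, gesture: str) -> str:
        # Claim 13: a third set of touch-based inputs lets the user navigate
        # the device while the panel and soft buttons are hidden.
        if not self.panel_visible and not self.soft_buttons_visible:
            if gesture in ("swipe_left", "swipe_right", "swipe_up"):
                return f"navigate:{gesture}"
        return "ignored"

if __name__ == "__main__":
    ctrl = StatusPanelController()
    ctrl.toggle_panel()                       # hide the panel
    ctrl.soft_buttons_visible = False         # hide the soft buttons too
    print(ctrl.handle_gesture("swipe_left"))  # navigate:swipe_left
```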

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen (touch pad) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (angular position) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first communication) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US20080007533A1
CLAIM 16
. The I/O device as recited in claim 15 wherein the I/O device further includes a first communication (first portion) line electrically coupled to the electrode and a second communication line electrically coupled to the second connection points of the one or more I/O mechanisms .

US20080007533A1
CLAIM 30
. The touch device as recited in claim 25 wherein the touch sensing nodes are laid out in a circular fashion such that each touch sensing node represents a distinct angular position (first mode, first set) within the touch plane .

US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch pad) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US20080007533A1
CLAIM 32
. The touch device as recited in claim 25 wherein the touch device is a touch pad (display screen, add one) .
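
Claims 1, 14, and 15 of US9645663B2 recite a single touchscreen partitioned into an active region with a first mode of response and an edge-adjacent virtual bezel region whose second mode, produced by a gestural software application, selectively interprets touches as intentional input affecting the active-region content. A minimal Python sketch of that partitioning and dispatch logic, assuming hypothetical dimensions, names, and an externally supplied intentionality decision:

```python
from dataclasses import dataclass

@dataclass
class Touch:
    x: float
    y: float

class VirtualBezelDisplay:
    """Sketch of the claimed architecture: one touchscreen whose outer strip
    (the virtual bezel) responds in a second mode produced by a gestural
    software layer, while the inner area responds in the first mode."""

    def __init__(self, width: int, height: int, bezel_px: int) -> None:
        self.width, self.height, self.bezel_px = width, height, bezel_px

    def in_bezel(self, t: Touch) -> bool:
        return (t.x < self.bezel_px or t.y < self.bezel_px or
                t.x > self.width - self.bezel_px or t.y > self.height - self.bezel_px)

    def dispatch(self, t: Touch, is_intentional_gesture: bool) -> str:
        if self.in_bezel(t):
            # Second mode: selectively interpret bezel touches; only inputs the
            # gestural software deems intentional affect the active-region content.
            return "affect_active_content" if is_intentional_gesture else "suppressed"
        return "first_mode_input"   # normal touchscreen handling in the active region

if __name__ == "__main__":
    screen = VirtualBezelDisplay(1080, 1920, bezel_px=48)
    print(screen.dispatch(Touch(10, 900), is_intentional_gesture=False))   # suppressed
    print(screen.dispatch(Touch(10, 900), is_intentional_gesture=True))    # affect_active_content
    print(screen.dispatch(Touch(500, 900), is_intentional_gesture=False))  # first_mode_input
```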

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction (second communication lines) in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .

US20080007533A1
CLAIM 18
. The I/O device as recited in claim 16 further comprising a capacitor positioned between the first and second communication lines (screen mode, s response instruction) in order include the one or more I/O mechanism in the total electrode area during capacitive sensing .
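
The claim 16 method charted above detects the grip region, registers it as the virtual bezel, and stores the user's response instruction as personalized behavior for that region. A minimal Python sketch of those registration steps, with all names and the default response being hypothetical:

```python
from typing import Dict, List, Tuple

Region = List[Tuple[float, float]]  # touch coordinates attributed to the gripping hand

class BezelRegistry:
    """Sketch of the claim 16 method: detect the grip region, register it as the
    virtual bezel, and store the user's chosen response as personalized behavior."""

    def __init__(self) -> None:
        self.bezel_region: Region = []
        self.personalized_behavior: Dict[str, str] = {}

    def register_grip(self, contact_points: Region) -> None:
        # "registering the detected region as the virtual bezel region in a memory"
        self.bezel_region = list(contact_points)

    def register_instruction(self, gesture: str, response: str) -> None:
        # "registering the user's response instruction ... as personalized behavior"
        self.personalized_behavior[gesture] = response

    def interpret(self, gesture: str) -> str:
        # Touch input inside the bezel is interpreted per the stored instruction.
        return self.personalized_behavior.get(gesture, "default_scroll")

if __name__ == "__main__":
    reg = BezelRegistry()
    reg.register_grip([(2.0, 400.0), (4.0, 520.0), (3.0, 640.0)])
    reg.register_instruction("double_tap", "toggle_full_screen")
    print(reg.interpret("double_tap"))  # toggle_full_screen
```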

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices (output mechanism) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20080007533A1
CLAIM 6
. The touch sensing device as recited in claim 1 wherein the I/O mechanism is an output mechanism (area comprising vertices) .

US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .
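
The claim 17 method charted above registers the polygonal area formed by unintentional grip touches, tracks how frequently it is accessed, and uses that frequency to define a personalized holding pattern for the virtual bezel. A minimal Python sketch of that bookkeeping, where the promotion threshold and all names are hypothetical:

```python
from collections import Counter
from typing import List, Optional, Tuple

Polygon = Tuple[Tuple[float, float], ...]

class HoldingPatternModel:
    """Sketch of the claim 17 method: register the polygonal area formed by
    unintentional grip touches, count how often it is hit, and promote the most
    frequently used polygon to the personalized holding pattern / virtual bezel."""

    def __init__(self, promote_after: int = 5) -> None:
        self.access_counts: Counter = Counter()
        self.promote_after = promote_after
        self.holding_pattern: Optional[Polygon] = None

    def register_unintentional_touch(self, vertices: List[Tuple[float, float]]) -> None:
        polygon: Polygon = tuple(vertices)          # "registering the polygonal area"
        self.access_counts[polygon] += 1            # "detecting the frequency of accessing"
        if self.access_counts[polygon] >= self.promote_after:
            self.holding_pattern = polygon          # define the personalized holding pattern

if __name__ == "__main__":
    model = HoldingPatternModel(promote_after=3)
    grip = [(0.0, 300.0), (30.0, 300.0), (30.0, 700.0), (0.0, 700.0)]
    for _ in range(3):
        model.register_unintentional_touch(grip)
    print(model.holding_pattern is not None)  # True: the grip area defines the virtual bezel
```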

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature (same communication) from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices (output mechanism) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20080007533A1
CLAIM 1
. A touch sensing device , comprising : one or more multifunctional nodes each of which represents a single touch pixel , each multifunctional node including a touch sensor with one or more integrated I/O mechanisms , the touch sensor and integrated I/O mechanisms sharing the same communication (heat signature) lines and I/O pins of a controller during operation of the touch sensing device .

US20080007533A1
CLAIM 6
. The touch sensing device as recited in claim 1 wherein the I/O mechanism is an output mechanism (area comprising vertices) .

US20080007533A1
CLAIM 13
. An I/O device for use in a user interface of an electronic device (electronic device) , the I/O device comprising : a capacitive sensing electrode ;
one or more I/O mechanisms integrated with the capacitive sensing electrode such that the electrode and I/O mechanisms are incorporated into a single defined node of the I/O device .
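
Claim 18 charted above substitutes a thermal front end: the device's thermal sensors yield a heat signature whose warmed area supplies the vertices of the polygonal grip region. A minimal Python sketch of thresholding a (hypothetical) thermal grid into such a polygon; the grid layout, threshold, and function name are illustrative assumptions only.

```python
from typing import List, Optional, Tuple

def heat_signature_polygon(thermal_grid: List[List[float]],
                           threshold: float) -> Optional[List[Tuple[int, int]]]:
    """Sketch of the claim 18 front end: threshold a thermal-sensor grid to find
    cells warmed by the gripping hand and return the bounding polygon's vertices,
    which downstream code registers as the virtual bezel region."""
    warm = [(x, y) for y, row in enumerate(thermal_grid)
            for x, temp in enumerate(row) if temp >= threshold]
    if not warm:
        return None
    xs, ys = [p[0] for p in warm], [p[1] for p in warm]
    # Vertices of the polygonal (here rectangular) area formed by the heat signature.
    return [(min(xs), min(ys)), (max(xs), min(ys)), (max(xs), max(ys)), (min(xs), max(ys))]

if __name__ == "__main__":
    grid = [
        [22.0, 22.0, 22.0],
        [31.5, 31.0, 22.0],   # warm cells along the left edge: the gripping hand
        [32.0, 22.0, 22.0],
    ]
    print(heat_signature_polygon(grid, threshold=30.0))
    # [(0, 1), (1, 1), (1, 2), (0, 2)]
```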




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20060238517A1

Filed: 2006-06-23     Issued: 2006-10-26

Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control

(Original Assignee) Apple Computer Inc     (Current Assignee) Apple Inc

Nick King, Duncan Kerr, Paul Herbst, Steven Hotelling
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (designating one) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20060238517A1
CLAIM 25
. An electronic device , comprising : a display positioned on the electronic device and having a perimeter ;
a multi-touch input bezel positioned on the electronic device substantially around the perimeter of the display ;
and processing circuitry operatively connected to the display and the multi-touch input bezel , the processing circuitry designating one (touchscreen layer) or more areas on the multi-touch input bezel for one or more controls used to operate the electronic device , the processing circuitry generating one or more visual guides corresponding to the one or more controls and sending visual data to the display for displaying the one or more visual guides on the display adjacent the one or more areas on the multi-touch input bezel designated for the one or more controls , the processing circuitry obtaining touch data from the multi-touch input bezel and determining if at least one of the one or more controls that corresponds to the designated areas is invoked by the obtained touch data .
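
Claim 25 of US20060238517A1, quoted above, has processing circuitry designate areas of a multi-touch bezel for controls, generate visual guides adjacent to those areas on the display, and determine which control an incoming touch invokes. A minimal Python sketch of that control bookkeeping, where the guide placement (a simple inset of each designated area) and all names are hypothetical simplifications:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

Rect = Tuple[float, float, float, float]  # left, top, right, bottom

@dataclass
class TouchBezelController:
    """Sketch of the '517 claim 25 processing circuitry: designate bezel areas
    for controls, emit visual-guide rectangles for the display, and report which
    control an incoming touch invokes."""
    controls: Dict[str, Rect] = field(default_factory=dict)

    def designate(self, name: str, area: Rect) -> Dict[str, Rect]:
        self.controls[name] = area
        # "generating one or more visual guides corresponding to the controls":
        # here each guide is simply an inset copy of its designated bezel area.
        return {n: (l + 4, t + 4, r - 4, b - 4) for n, (l, t, r, b) in self.controls.items()}

    def invoked_control(self, x: float, y: float) -> Optional[str]:
        for name, (l, t, r, b) in self.controls.items():
            if l <= x <= r and t <= y <= b:
                return name
        return None

if __name__ == "__main__":
    bezel = TouchBezelController()
    guides = bezel.designate("volume", (0, 100, 20, 300))
    print(guides["volume"])                # guide rectangle for the designated area
    print(bezel.invoked_control(10, 150))  # volume
```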

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (designating one) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20060238517A1
CLAIM 25
. An electronic device , comprising : a display positioned on the electronic device and having a perimeter ;
a multi-touch input bezel positioned on the electronic device substantially around the perimeter of the display ;
and processing circuitry operatively connected to the display and the multi-touch input bezel , the processing circuitry designating one (touchscreen layer) or more areas on the multi-touch input bezel for one or more controls used to operate the electronic device , the processing circuitry generating one or more visual guides corresponding to the one or more controls and sending visual data to the display for displaying the one or more visual guides on the display adjacent the one or more areas on the multi-touch input bezel designated for the one or more controls , the processing circuitry obtaining touch data from the multi-touch input bezel and determining if at least one of the one or more controls that corresponds to the designated areas is invoked by the obtained touch data .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (surface position) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20060238517A1
CLAIM 1
. An electronic device , comprising : a display positioned on the electronic device and having a perimeter ;
at least one touch sensitive surface positioned (holding pattern) on the electronic device adjacent at least a portion of the perimeter of the display ;
and processing circuitry operatively connected to the display and to the at least one touch sensitive surface , the processing circuitry configured to : designate at least one area of the at least one touch sensitive surface for at least one control ;
generate at least one visual guide for the at least one control ;
and present the at least one visual guide for display at a location on the display adjacent the at least one area designated for the at least one control .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (surface position) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20060238517A1
CLAIM 1
. An electronic device , comprising : a display positioned on the electronic device and having a perimeter ;
at least one touch sensitive surface positioned (holding pattern) on the electronic device adjacent at least a portion of the perimeter of the display ;
and processing circuitry operatively connected to the display and to the at least one touch sensitive surface , the processing circuitry configured to : designate at least one area of the at least one touch sensitive surface for at least one control ;
generate at least one visual guide for the at least one control ;
and present the at least one visual guide for display at a location on the display adjacent the at least one area designated for the at least one control .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20070291008A1

Filed: 2006-06-16     Issued: 2007-12-20

Inverted direct touch sensitive input devices

(Original Assignee) Mitsubishi Electric Research Laboratories Inc     (Current Assignee) Mitsubishi Electric Research Laboratories Inc

Daniel Wigdor, Darren Leigh, Clifton Forlines, Chia Shen, John C. Barnwell, Samuel E. Shipman
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (current touch) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20070291008A1
CLAIM 10
. The device of claim 1 , in which multiple concurrent touch (first set) es by multiple users are uniquely identified with the multiple users .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (display images, display unit) using predefined set of gestures to toggle a full-screen mode .
US20070291008A1
CLAIM 1
. A direct touch-sensitive input device , comprising : a display surface configured to display images (status bar visibility) on a front of the display surface ;
and a first direct touch-sensitive surface mounted on a back of the display surface , in which the display surface and the direct touch surface are geometrically coincident .

US20070291008A1
CLAIM 4
. The device of claim 2 , in which the means for displaying is a plasma display unit (status bar visibility) .
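
Claim 8 above places an operating system status bar in the virtual bezel region and lets the user toggle its visibility, and thereby a full-screen mode, with a predefined set of gestures. A minimal Python sketch of that toggle, with the gesture set being a hypothetical placeholder:

```python
class FullScreenToggler:
    """Sketch of claim 8: an operating-system status bar living in the virtual
    bezel whose visibility the user toggles with a predefined gesture set,
    switching the device in and out of a full-screen mode."""

    TOGGLE_GESTURES = {"two_finger_swipe_down", "edge_double_tap"}  # hypothetical set

    def __init__(self) -> None:
        self.status_bar_visible = True

    def on_bezel_gesture(self, gesture: str) -> bool:
        if gesture in self.TOGGLE_GESTURES:
            self.status_bar_visible = not self.status_bar_visible
        return not self.status_bar_visible   # True means the UI is now full screen

if __name__ == "__main__":
    ui = FullScreenToggler()
    print(ui.on_bezel_gesture("edge_double_tap"))   # True: entered full-screen mode
    print(ui.on_bezel_gesture("edge_double_tap"))   # False: status bar restored
```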




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2006094308A2

Filed: 2006-03-03     Issued: 2006-09-08

Multi-functional hand-held device

(Original Assignee) Apple Computer, Inc.     

Steve P. Hotelling
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (current touch) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (including one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2006094308A2
CLAIM 1
. A hand-held electronic device , comprising : a multi-touch input surface ;
and a processing unit operatively connected to said multi-touch input surface , said processing unit capable of receiving a plurality of concurrent touch (first set) inputs from a user via said multi-touch input surface and discriminating a user requested action from the touch inputs ;
and a display device operatively coupled to the processing unit and configured to present a user interface .

WO2006094308A2
CLAIM 2
. A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , handtop , Internet terminal , GPS receiver , and remote control . 3 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or mode of said hand-held electronic device . 4 . A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen display . 5 . A hand-held electronic device as recited in claim 1 , wherein said multi- touch input surface is integral with said display device . 6 . A hand-held electronic device as recited in claim 5 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , camera , handtop , Internet terminal , GPS receiver , and remote control . 7 . A hand-held electronic device as recited in claim 5 , wherein said multi- touch input surface serves as the primary input means necessary to interact with said hand-held electronic device . 8 . A hand-held electronic device as recited in claim 7 , wherein said handheld electronic device includes cross-functional physical buttons . 9 . A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multipoint capacitive touch screen . 10 . A hand-held electronic device as recited in claim 9 , wherein said handheld electronic device is operable to recognize touch gestures applied to said multi-touch input surface wherein the touch gestures are used to control aspects of said hand-held electronic device . 11 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is operable to receive simultaneous inputs from different inputs devices and perform actions based on the simultaneous inputs . 12 . A hand-held electronic device as recited in claim 1 , wherein signals from various input devices of said hand-held electronic device have different meanings or outputs based on a mode of said hand-held electronic device . 13 . A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one (second set) or more virtual controls for user interaction . 14 . A hand-held electronic device as recited in claim 13 , wherein at least one of the standard region and the control region are user configurable . 15 . A hand-held electronic device as recited in claim 1 , wherein said display device comprises a force sensitive display , said force sensitive display producing one or more input signals to be generated when force is exerted thereon . 16 . A hand-held electronic device as recited in claims 15 , wherein said force sensitive display senses a force indication , and wherein said hand-held electronic device distinguishes the force indication into at least a first touch type and a second touch type . 17 . A hand-held electronic device as recited in any of claim 16 , wherein the first touch type corresponds to a light touch , and the second touch type corresponds to a hard touch . 18 . 
A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device provides audio or tactile feedback to a user based on user inputs made with respect to said hand-held electronic device . 19 . A hand-held electronic device as recited in claim 1 , wherein hand-held electronic device is configurable to actively look for signals in a surrounding environment , and change user interface or mode of operation based on the signals . 20 . A hand-held computing device , comprising : a housing ;
a display arrangement positioned within said housing , said display arrangement including a display and a touch screen ;
and a device configured to generate a signal when some portion of said display arrangement is moved . 21 . A hand-held electronic device , comprising : a touch screen ;
and a processing unit operatively connected to said touch screen , said processing unit concurrently receives a plurality of touch inputs from a user via said touch screen and discriminates a user requested action from the touch inputs , wherein said touch screen serves as the primary input means necessary to interact with said hand-held electronic device . 22 . A hand-held electronic device as recited in claim 21 , wherein said media device operates as one or more of a mobile phone , a PDA , a media player , a camera , a same player , a handtop , an Internet terminal , a GPS receiver , or a remote controller .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2006094308A2
CLAIM 2
. A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , handtop , Internet terminal , GPS receiver , and remote control . 3 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or mode of said hand-held electronic device . 4 . A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen display . 5 . A hand-held electronic device as recited in claim 1 , wherein said multi- touch input surface is integral with said display device . 6 . A hand-held electronic device as recited in claim 5 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , camera , handtop , Internet terminal , GPS receiver , and remote control . 7 . A hand-held electronic device as recited in claim 5 , wherein said multi- touch input surface serves as the primary input means necessary to interact with said hand-held electronic device . 8 . A hand-held electronic device as recited in claim 7 , wherein said handheld electronic device includes cross-functional physical buttons . 9 . A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multipoint capacitive touch screen (electronic device status display panel) . 10 . A hand-held electronic device as recited in claim 9 , wherein said handheld electronic device is operable to recognize touch gestures applied to said multi-touch input surface wherein the touch gestures are used to control aspects of said hand-held electronic device . 11 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is operable to receive simultaneous inputs from different inputs devices and perform actions based on the simultaneous inputs . 12 . A hand-held electronic device as recited in claim 1 , wherein signals from various input devices of said hand-held electronic device have different meanings or outputs based on a mode of said hand-held electronic device . 13 . A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one or more virtual controls for user interaction . 14 . A hand-held electronic device as recited in claim 13 , wherein at least one of the standard region and the control region are user configurable . 15 . A hand-held electronic device as recited in claim 1 , wherein said display device comprises a force sensitive display , said force sensitive display producing one or more input signals to be generated when force is exerted thereon . 16 . A hand-held electronic device as recited in claims 15 , wherein said force sensitive display senses a force indication , and wherein said hand-held electronic device distinguishes the force indication into at least a first touch type and a second touch type . 17 . A hand-held electronic device as recited in any of claim 16 , wherein the first touch type corresponds to a light touch , and the second touch type corresponds to a hard touch . 18 . 
A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device provides audio or tactile feedback to a user based on user inputs made with respect to said hand-held electronic device . 19 . A hand-held electronic device as recited in claim 1 , wherein hand-held electronic device is configurable to actively look for signals in a surrounding environment , and change user interface or mode of operation based on the signals . 20 . A hand-held computing device , comprising : a housing ;
a display arrangement positioned within said housing , said display arrangement including a display and a touch screen ;
and a device configured to generate a signal when some portion of said display arrangement is moved . 21 . A hand-held electronic device , comprising : a touch screen ;
and a processing unit operatively connected to said touch screen , said processing unit concurrently receives a plurality of touch inputs from a user via said touch screen and discriminates a user requested action from the touch inputs , wherein said touch screen serves as the primary input means necessary to interact with said hand-held electronic device . 22 . A hand-held electronic device as recited in claim 21 , wherein said media device operates as one or more of a mobile phone , a PDA , a media player , a camera , a same player , a handtop , an Internet terminal , a GPS receiver , or a remote controller .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2006094308A2
CLAIM 2
. A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , handtop , Internet terminal , GPS receiver , and remote control . 3 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or mode of said hand-held electronic device . 4 . A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen display . 5 . A hand-held electronic device as recited in claim 1 , wherein said multi- touch input surface is integral with said display device . 6 . A hand-held electronic device as recited in claim 5 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , camera , handtop , Internet terminal , GPS receiver , and remote control . 7 . A hand-held electronic device as recited in claim 5 , wherein said multi- touch input surface serves as the primary input means necessary to interact with said hand-held electronic device . 8 . A hand-held electronic device as recited in claim 7 , wherein said handheld electronic device includes cross-functional physical buttons . 9 . A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multipoint capacitive touch screen (electronic device status display panel) . 10 . A hand-held electronic device as recited in claim 9 , wherein said handheld electronic device is operable to recognize touch gestures applied to said multi-touch input surface wherein the touch gestures are used to control aspects of said hand-held electronic device . 11 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is operable to receive simultaneous inputs from different inputs devices and perform actions based on the simultaneous inputs . 12 . A hand-held electronic device as recited in claim 1 , wherein signals from various input devices of said hand-held electronic device have different meanings or outputs based on a mode of said hand-held electronic device . 13 . A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one or more virtual controls for user interaction . 14 . A hand-held electronic device as recited in claim 13 , wherein at least one of the standard region and the control region are user configurable . 15 . A hand-held electronic device as recited in claim 1 , wherein said display device comprises a force sensitive display , said force sensitive display producing one or more input signals to be generated when force is exerted thereon . 16 . A hand-held electronic device as recited in claims 15 , wherein said force sensitive display senses a force indication , and wherein said hand-held electronic device distinguishes the force indication into at least a first touch type and a second touch type . 17 . A hand-held electronic device as recited in any of claim 16 , wherein the first touch type corresponds to a light touch , and the second touch type corresponds to a hard touch . 18 . 
A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device provides audio or tactile feedback to a user based on user inputs made with respect to said hand-held electronic device . 19 . A hand-held electronic device as recited in claim 1 , wherein hand-held electronic device is configurable to actively look for signals in a surrounding environment , and change user interface or mode of operation based on the signals . 20 . A hand-held computing device , comprising : a housing ;
a display arrangement positioned within said housing , said display arrangement including a display and a touch screen ;
and a device configured to generate a signal when some portion of said display arrangement is moved . 21 . A hand-held electronic device , comprising : a touch screen ;
and a processing unit operatively connected to said touch screen , said processing unit concurrently receives a plurality of touch inputs from a user via said touch screen and discriminates a user requested action from the touch inputs , wherein said touch screen serves as the primary input means necessary to interact with said hand-held electronic device . 22 . A hand-held electronic device as recited in claim 21 , wherein said media device operates as one or more of a mobile phone , a PDA , a media player , a camera , a same player , a handtop , an Internet terminal , a GPS receiver , or a remote controller .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2006094308A2
CLAIM 2
. A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , handtop , Internet terminal , GPS receiver , and remote control . 3 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or mode of said hand-held electronic device . 4 . A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen (touchscreen display) display . 5 . A hand-held electronic device as recited in claim 1 , wherein said multi- touch input surface is integral with said display device . 6 . A hand-held electronic device as recited in claim 5 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , camera , handtop , Internet terminal , GPS receiver , and remote control . 7 . A hand-held electronic device as recited in claim 5 , wherein said multi- touch input surface serves as the primary input means necessary to interact with said hand-held electronic device . 8 . A hand-held electronic device as recited in claim 7 , wherein said handheld electronic device includes cross-functional physical buttons . 9 . A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multipoint capacitive touch screen . 10 . A hand-held electronic device as recited in claim 9 , wherein said handheld electronic device is operable to recognize touch gestures applied to said multi-touch input surface wherein the touch gestures are used to control aspects of said hand-held electronic device . 11 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is operable to receive simultaneous inputs from different inputs devices and perform actions based on the simultaneous inputs . 12 . A hand-held electronic device as recited in claim 1 , wherein signals from various input devices of said hand-held electronic device have different meanings or outputs based on a mode of said hand-held electronic device . 13 . A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one or more virtual controls for user interaction . 14 . A hand-held electronic device as recited in claim 13 , wherein at least one of the standard region and the control region are user configurable . 15 . A hand-held electronic device as recited in claim 1 , wherein said display device comprises a force sensitive display , said force sensitive display producing one or more input signals to be generated when force is exerted thereon . 16 . A hand-held electronic device as recited in claims 15 , wherein said force sensitive display senses a force indication , and wherein said hand-held electronic device distinguishes the force indication into at least a first touch type and a second touch type . 17 . A hand-held electronic device as recited in any of claim 16 , wherein the first touch type corresponds to a light touch , and the second touch type corresponds to a hard touch . 18 . 
A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device provides audio or tactile feedback to a user based on user inputs made with respect to said hand-held electronic device . 19 . A hand-held electronic device as recited in claim 1 , wherein hand-held electronic device is configurable to actively look for signals in a surrounding environment , and change user interface or mode of operation based on the signals . 20 . A hand-held computing device , comprising : a housing ;
a display arrangement positioned within said housing , said display arrangement including a display and a touch screen ;
and a device configured to generate a signal when some portion of said display arrangement is moved . 21 . A hand-held electronic device , comprising : a touch screen ;
and a processing unit operatively connected to said touch screen , said processing unit concurrently receives a plurality of touch inputs from a user via said touch screen and discriminates a user requested action from the touch inputs , wherein said touch screen serves as the primary input means necessary to interact with said hand-held electronic device . 22 . A hand-held electronic device as recited in claim 21 , wherein said media device operates as one or more of a mobile phone , a PDA , a media player , a camera , a same player , a handtop , an Internet terminal , a GPS receiver , or a remote controller .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2006094308A2
CLAIM 2
. A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , handtop , Internet terminal , GPS receiver , and remote control . 3 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or mode of said hand-held electronic device . 4 . A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen (touchscreen display) display . 5 . A hand-held electronic device as recited in claim 1 , wherein said multi- touch input surface is integral with said display device . 6 . A hand-held electronic device as recited in claim 5 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , camera , handtop , Internet terminal , GPS receiver , and remote control . 7 . A hand-held electronic device as recited in claim 5 , wherein said multi- touch input surface serves as the primary input means necessary to interact with said hand-held electronic device . 8 . A hand-held electronic device as recited in claim 7 , wherein said handheld electronic device includes cross-functional physical buttons . 9 . A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multipoint capacitive touch screen . 10 . A hand-held electronic device as recited in claim 9 , wherein said handheld electronic device is operable to recognize touch gestures applied to said multi-touch input surface wherein the touch gestures are used to control aspects of said hand-held electronic device . 11 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is operable to receive simultaneous inputs from different inputs devices and perform actions based on the simultaneous inputs . 12 . A hand-held electronic device as recited in claim 1 , wherein signals from various input devices of said hand-held electronic device have different meanings or outputs based on a mode of said hand-held electronic device . 13 . A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one or more virtual controls for user interaction . 14 . A hand-held electronic device as recited in claim 13 , wherein at least one of the standard region and the control region are user configurable . 15 . A hand-held electronic device as recited in claim 1 , wherein said display device comprises a force sensitive display , said force sensitive display producing one or more input signals to be generated when force is exerted thereon . 16 . A hand-held electronic device as recited in claims 15 , wherein said force sensitive display senses a force indication , and wherein said hand-held electronic device distinguishes the force indication into at least a first touch type and a second touch type . 17 . A hand-held electronic device as recited in any of claim 16 , wherein the first touch type corresponds to a light touch , and the second touch type corresponds to a hard touch . 18 . 
A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device provides audio or tactile feedback to a user based on user inputs made with respect to said hand-held electronic device . 19 . A hand-held electronic device as recited in claim 1 , wherein hand-held electronic device is configurable to actively look for signals in a surrounding environment , and change user interface or mode of operation based on the signals . 20 . A hand-held computing device , comprising : a housing ;
a display arrangement positioned within said housing , said display arrangement including a display and a touch screen ;
and a device configured to generate a signal when some portion of said display arrangement is moved . 21 . A hand-held electronic device , comprising : a touch screen ;
and a processing unit operatively connected to said touch screen , said processing unit concurrently receives a plurality of touch inputs from a user via said touch screen and discriminates a user requested action from the touch inputs , wherein said touch screen serves as the primary input means necessary to interact with said hand-held electronic device . 22 . A hand-held electronic device as recited in claim 21 , wherein said media device operates as one or more of a mobile phone , a PDA , a media player , a camera , a same player , a handtop , an Internet terminal , a GPS receiver , or a remote controller .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2006094308A2
CLAIM 2
. A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , handtop , Internet terminal , GPS receiver , and remote control . 3 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is capable of reconfiguring or adapting the user interface based on the state or mode of said hand-held electronic device . 4 . A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen (touchscreen display) display . 5 . A hand-held electronic device as recited in claim 1 , wherein said multi- touch input surface is integral with said display device . 6 . A hand-held electronic device as recited in claim 5 , wherein said handheld electronic device is includes two or more of the following device functionalities : PDA , mobile phone , music player , camera , video player , game player , camera , handtop , Internet terminal , GPS receiver , and remote control . 7 . A hand-held electronic device as recited in claim 5 , wherein said multi- touch input surface serves as the primary input means necessary to interact with said hand-held electronic device . 8 . A hand-held electronic device as recited in claim 7 , wherein said handheld electronic device includes cross-functional physical buttons . 9 . A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multipoint capacitive touch screen . 10 . A hand-held electronic device as recited in claim 9 , wherein said handheld electronic device is operable to recognize touch gestures applied to said multi-touch input surface wherein the touch gestures are used to control aspects of said hand-held electronic device . 11 . A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device is operable to receive simultaneous inputs from different inputs devices and perform actions based on the simultaneous inputs . 12 . A hand-held electronic device as recited in claim 1 , wherein signals from various input devices of said hand-held electronic device have different meanings or outputs based on a mode of said hand-held electronic device . 13 . A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one or more virtual controls for user interaction . 14 . A hand-held electronic device as recited in claim 13 , wherein at least one of the standard region and the control region are user configurable . 15 . A hand-held electronic device as recited in claim 1 , wherein said display device comprises a force sensitive display , said force sensitive display producing one or more input signals to be generated when force is exerted thereon . 16 . A hand-held electronic device as recited in claims 15 , wherein said force sensitive display senses a force indication , and wherein said hand-held electronic device distinguishes the force indication into at least a first touch type and a second touch type . 17 . A hand-held electronic device as recited in any of claim 16 , wherein the first touch type corresponds to a light touch , and the second touch type corresponds to a hard touch . 18 . 
A hand-held electronic device as recited in claim 1 , wherein said handheld electronic device provides audio or tactile feedback to a user based on user inputs made with respect to said hand-held electronic device . 19 . A hand-held electronic device as recited in claim 1 , wherein hand-held electronic device is configurable to actively look for signals in a surrounding environment , and change user interface or mode of operation based on the signals . 20 . A hand-held computing device , comprising : a housing ;
a display arrangement positioned within said housing , said display arrangement including a display and a touch screen ;
and a device configured to generate a signal when some portion of said display arrangement is moved . 21 . A hand-held electronic device , comprising : a touch screen ;
and a processing unit operatively connected to said touch screen , said processing unit concurrently receives a plurality of touch inputs from a user via said touch screen and discriminates a user requested action from the touch inputs , wherein said touch screen serves as the primary input means necessary to interact with said hand-held electronic device . 22 . A hand-held electronic device as recited in claim 21 , wherein said media device operates as one or more of a mobile phone , a PDA , a media player , a camera , a same player , a handtop , an Internet terminal , a GPS receiver , or a remote controller .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
KR20070110114A

Filed: 2006-03-03     Issued: 2007-11-15

Multi-Functional Portable Device (다기능 휴대용 장치)

(Original Assignee) Apple Inc. (애플 인크.)

Steve P. Hotelling (스티브 피. 호텔링)
US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction (입력들을 [inputs], indication) in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
KR20070110114A
CLAIM 1
A portable electronic device comprising : a multi-touch input surface ;
a processing unit operatively connected to said multi-touch input surface , capable of receiving a plurality of concurrent touch inputs (response instruction) from a user via said multi-touch input surface and discriminating a user-requested action from the touch inputs ;
and a display device operatively connected to the processing unit and configured to present a user interface .

KR20070110114A
CLAIM 16
The portable electronic device of claim 15 , wherein the force sensitive display senses a force indication (response instruction) , and wherein the portable electronic device distinguishes the force indication into at least a first touch type and a second touch type .

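For orientation, the claim 16 method charted above (detect the grip-contact region, register it as the virtual bezel, and store the user's chosen response for that region) could be sketched roughly as follows. The rectangle model, the margin value, and all identifiers are hypothetical assumptions made here; this is not an implementation disclosed by either the patent or the reference.

from typing import Dict, List, Tuple

Rect = Tuple[int, int, int, int]  # x, y, width, height in pixels

class VirtualBezelRegistry:
    def __init__(self) -> None:
        self.bezel_regions: List[Rect] = []            # stands in for "a memory of the electronic device"
        self.personalized_behavior: Dict[Rect, str] = {}

    def detect_grip_region(self, contact_points: List[Tuple[int, int]], margin: int = 40) -> Rect:
        # Bound the finger-contact points of the holding hand with a padded rectangle.
        xs = [p[0] for p in contact_points]
        ys = [p[1] for p in contact_points]
        return (min(xs) - margin, min(ys) - margin,
                max(xs) - min(xs) + 2 * margin, max(ys) - min(ys) + 2 * margin)

    def register(self, region: Rect, response_instruction: str) -> None:
        # Register the detected region and the user's response instruction as personalized behavior.
        self.bezel_regions.append(region)
        self.personalized_behavior[region] = response_instruction

# Example: register the area under the holding hand and a user-chosen response for it.
registry = VirtualBezelRegistry()
grip = registry.detect_grip_region([(5, 300), (8, 360), (12, 420)])
registry.register(grip, response_instruction="scroll content")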



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20060197753A1

Filed: 2006-03-03     Issued: 2006-09-07

Multi-functional hand-held device

(Original Assignee) Apple Computer Inc     (Current Assignee) Apple Inc

Steven Hotelling
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (current touch) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (including one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20060197753A1
CLAIM 1
. A hand-held electronic device , comprising : a multi-touch input surface ;
and a processing unit operatively connected to said multi-touch input surface , said processing unit capable of receiving a plurality of concurrent touch (first set) inputs from a user via said multi-touch input surface and discriminating a user requested action from the touch inputs ;
and a display device operatively coupled to the processing unit and configured to present a user interface .

US20060197753A1
CLAIM 13
. A hand-held electronic device as recited in claim 1 , wherein said user interface comprises a standard region and a control region the standard region being used to display data , and the control region including one (second set) or more virtual controls for user interaction .

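As a rough illustration of the two response modes recited in claim 1 of US9645663B2 (a first mode in the active touchscreen region and a second, selectively interpreting mode in the virtual bezel region), the sketch below routes a touch by screen region. The edge-strip bezel geometry, the gesture flag, and the identifiers are assumptions made here for readability, not an implementation from the patent.

from typing import Tuple

Point = Tuple[int, int]

def in_virtual_bezel(point: Point, screen_width: int, bezel_px: int = 48) -> bool:
    # Model the virtual bezel as strips along the left and right edges of the display.
    return point[0] < bezel_px or point[0] > screen_width - bezel_px

def handle_touch(point: Point, is_deliberate_gesture: bool, screen_width: int = 1080) -> str:
    if not in_virtual_bezel(point, screen_width):
        return "first mode: deliver the input to the active-region content"
    if is_deliberate_gesture:
        return "second mode: interpret as intentional input affecting the active-region content"
    return "second mode: treat as incidental grip contact and ignore"

# Example: resting a thumb on the edge is ignored; a deliberate edge swipe is passed through.
print(handle_touch((20, 500), is_deliberate_gesture=False))
print(handle_touch((20, 500), is_deliberate_gesture=True))
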
US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20060197753A1
CLAIM 9
. A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multi-point capacitive touch screen (electronic device status display panel) .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20060197753A1
CLAIM 9
. A hand-held electronic device as recited in claim 5 , wherein said the multi-touch input surface integral with the display device is a multi-point capacitive touch screen (electronic device status display panel) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20060197753A1
CLAIM 4
. A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen (touchscreen display) display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20060197753A1
CLAIM 4
. A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen (touchscreen display) display .

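To show how the claim 17 steps above (registering a polygonal contact area, tracking how frequently it is accessed, and promoting it to a personalized holding pattern) might fit together, here is a minimal sketch. The promotion threshold and the polygon-as-vertex-tuple representation are assumptions for illustration only.

from collections import Counter
from typing import List, Tuple

Polygon = Tuple[Tuple[int, int], ...]  # vertices of the unintentional-contact area

class HoldingPatternLearner:
    def __init__(self, promote_after: int = 20) -> None:
        self.area_frequency: Counter = Counter()
        self.holding_patterns: List[Polygon] = []  # together these define the virtual bezel region
        self.promote_after = promote_after

    def record_unintentional_touch(self, polygon: Polygon) -> None:
        # Register the polygonal area and track its usage frequency.
        self.area_frequency[polygon] += 1
        if (self.area_frequency[polygon] >= self.promote_after
                and polygon not in self.holding_patterns):
            self.holding_patterns.append(polygon)  # personalized holding pattern stored in "memory"

# Example: after enough repeated contacts, the thumb-rest area becomes a holding pattern.
learner = HoldingPatternLearner(promote_after=3)
thumb_area = ((0, 400), (60, 400), (60, 520), (0, 520))
for _ in range(3):
    learner.record_unintentional_touch(thumb_area)
assert thumb_area in learner.holding_patterns
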
US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (full screen) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing the device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20060197753A1
CLAIM 4
. A hand-held electronic device as recited in claim 3 , wherein said display device is a full screen (touchscreen display) display .

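Claim 18 above differs from claim 17 mainly in how the polygonal area is obtained: from a heat signature sensed by the device's thermal sensors rather than from touch events. A minimal sketch of deriving such a polygon from a temperature grid follows; the grid model and the 30 degree threshold are assumptions, not anything disclosed in the patent or the reference.

from typing import List, Tuple

def heat_signature_polygon(temps_c: List[List[float]], threshold_c: float = 30.0) -> Tuple[Tuple[int, int], ...]:
    # Return the bounding vertices of all thermal-sensor cells warmer than the threshold.
    hot = [(x, y) for y, row in enumerate(temps_c) for x, t in enumerate(row) if t >= threshold_c]
    if not hot:
        return ()
    xs = [p[0] for p in hot]
    ys = [p[1] for p in hot]
    return ((min(xs), min(ys)), (max(xs), min(ys)), (max(xs), max(ys)), (min(xs), max(ys)))

# Example: warmth along the lower-left edge yields the polygon registered as the bezel area.
grid = [[22.0, 22.5, 23.0],
        [31.0, 32.5, 23.0],
        [33.0, 31.5, 24.0]]
print(heat_signature_polygon(grid))  # ((0, 1), (1, 1), (1, 2), (0, 2))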



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US8018440B2

Filed: 2005-12-30     Issued: 2011-09-13

Unintentional touch rejection

(Original Assignee) Microsoft Corp     (Current Assignee) Microsoft Technology Licensing LLC

Reed L. Townsend, Alexander J. Kolmykov-Zotov, Steven P. Dodge, Bryan D. Scott
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (first character) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US8018440B2
CLAIM 5
. The method of claim 2 , wherein the generating of the set of parameters comprises : recording values for a set of characteristics associated with the touch ;
determining a rate of change of a first character (touchscreen layer, touchscreen display) istic in the set of characteristics ;
and providing the rate of change of the first characteristic as one of the parameters in the set of parameters .

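US8018440B2 claim 5, quoted above, records characteristics of a touch and uses the rate of change of one characteristic as a parameter in the rejection decision. The sketch below computes such a rate of change for one assumed characteristic (contact-area size); the sample format and identifiers are illustrative assumptions only.

from typing import List, Tuple

Sample = Tuple[float, float]  # (timestamp in seconds, contact area in mm^2)

def rate_of_change(samples: List[Sample]) -> float:
    # Rate of change of the recorded characteristic between the first and last samples.
    if len(samples) < 2:
        return 0.0
    (t0, v0), (t1, v1) = samples[0], samples[-1]
    return (v1 - v0) / (t1 - t0) if t1 != t0 else 0.0

# Example: a rapidly growing contact area (e.g., a settling palm) yields a large rate of change.
touch_samples = [(0.00, 12.0), (0.05, 18.0), (0.10, 26.0)]
parameters = {"area_rate_mm2_per_s": rate_of_change(touch_samples)}
print(parameters)  # roughly 140.0 mm^2 per second
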
US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (first character) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US8018440B2
CLAIM 5
. The method of claim 2 , wherein the generating of the set of parameters comprises : recording values for a set of characteristics associated with the touch ;
determining a rate of change of a first character (touchscreen layer, touchscreen display) istic in the set of characteristics ;
and providing the rate of change of the first characteristic as one of the parameters in the set of parameters .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (first character) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US8018440B2
CLAIM 5
. The method of claim 2 , wherein the generating of the set of parameters comprises : recording values for a set of characteristics associated with the touch ;
determining a rate of change of a first character (touchscreen layer, touchscreen display) istic in the set of characteristics ;
and providing the rate of change of the first characteristic as one of the parameters in the set of parameters .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (first character) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (low probability) to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US8018440B2
CLAIM 5
. The method of claim 2 , wherein the generating of the set of parameters comprises : recording values for a set of characteristics associated with the touch ;
determining a rate of change of a first character (touchscreen layer, touchscreen display) istic in the set of characteristics ;
and providing the rate of change of the first characteristic as one of the parameters in the set of parameters .

US8018440B2
CLAIM 15
. A method of rejecting unintentional input to a touch-sensitive display , the method comprising : receiving a first touch on the touch-sensitive display ;
providing a first level of filtering to the first touch ;
generating a first reliability value based in part on historical variations in one or more of a set of parameters of the touch ;
determining a first confidence level based on the first reliability value and an activity context , wherein the activity context comprises a user pattern which comprises a consistency demonstrated by a user using a stylus to enter data in a window of an application ;
determining that the first reliability value is less than a particular level , the first reliability value indicating a low probability (usage frequency) that the first touch is intended ;
determining that the first touch is a gesture specific to the application in response to determining that the first reliability level is less than the particular level ;
and transmitting an input related to the first touch to the application based on determining that the first touch is a gesture specific to the application .

US8018440B2
CLAIM 17
. The method of claim 16 , wherein the second touch is being applied at substantially the same time (holding pattern) as the first touch .

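The rejection flow of US8018440B2 claim 15, quoted above, can be summarized as: a reliability value below a particular level indicates a low probability that the touch is intended, but the touch is still delivered if it is recognized as a gesture specific to the active application. The sketch below captures only that branching; the threshold value and the gesture test are assumptions.

def route_touch(reliability: float, is_application_gesture: bool,
                particular_level: float = 0.4) -> str:
    # Deliver touches whose reliability meets the particular level.
    if reliability >= particular_level:
        return "deliver: touch treated as intended"
    # Low reliability: deliver only if the touch matches an application-specific gesture.
    if is_application_gesture:
        return "deliver: low reliability, but recognized as a gesture specific to the application"
    return "reject: low probability that the touch is intended"

# Example: a low-reliability touch that matches an application gesture is still delivered.
print(route_touch(0.2, is_application_gesture=True))
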
US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (first character) , the method comprising : receiving a heat signature from a user's hand (s hand) holding the electronic device utilizing the device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (low probability) to define a personalized holding pattern (same time) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US8018440B2
CLAIM 2
. The method of claim 1 , wherein the receiving a touch on the touch-sensitive display comprises : detecting the touch of a user's hand (s hand) on a first area of the touch-sensitive surface ;
and generating the set of parameters related to the touch .

US8018440B2
CLAIM 5
. The method of claim 2 , wherein the generating of the set of parameters comprises : recording values for a set of characteristics associated with the touch ;
determining a rate of change of a first character (touchscreen layer, touchscreen display) istic in the set of characteristics ;
and providing the rate of change of the first characteristic as one of the parameters in the set of parameters .

US8018440B2
CLAIM 15
. A method of rejecting unintentional input to a touch-sensitive display , the method comprising : receiving a first touch on the touch-sensitive display ;
providing a first level of filtering to the first touch ;
generating a first reliability value based in part on historical variations in one or more of a set of parameters of the touch ;
determining a first confidence level based on the first reliability value and an activity context , wherein the activity context comprises a user pattern which comprises a consistency demonstrated by a user using a stylus to enter data in a window of an application ;
determining that the first reliability value is less than a particular level , the first reliability value indicating a low probability (usage frequency) that the first touch is intended ;
determining that the first touch is a gesture specific to the application in response to determining that the first reliability level is less than the particular level ;
and transmitting an input related to the first touch to the application based on determining that the first touch is a gesture specific to the application .

US8018440B2
CLAIM 17
. The method of claim 16 , wherein the second touch is being applied at substantially the same time (holding pattern) as the first touch .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2005008444A2

Filed: 2004-07-14     Issued: 2005-01-27

System and method for a portbale multimedia client

(Original Assignee) Matt Pallakoff     

Matt Pallakoff
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device, sensor means) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set (second function) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (remote device) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 16
. The portable electronic display device of claim 12 , wherein said display is a touch screen display , whereby said portable electronic display device controller unit responds to the touching of an exposed portion of said touch screen display by controlling a second function (first set) of the portable electronic display device , said second function being different than said first function .

WO2005008444A2
CLAIM 35
. A hand-held remote control comprising : means for remotely controlling at least one remote device (second set) , said remote controlling means communicating control commands to said remote device ;
touch sensitive means for receiving patterns of touch by a human ;
and means for detecting the patterns of touch impressed upon said touch sensitive means , the detected patterns of touch being communicated as control commands to said remote controlling means .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


WO2005008444A2
CLAIM 46
. The display controlling method of claim 43 , wherein said first pattern of touch is touch sliding along said touch sensor and said first function is to scroll the displayed content in a first direction (first mode) corresponding to the direction the finger is sliding in .

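WO2005008444A2 claim 46, quoted above, scrolls the displayed content in the direction the finger slides, which the chart maps to the patent's "first mode". A minimal sketch of that mapping follows; the coordinate convention and identifiers are assumptions introduced here.

from typing import Tuple

def scroll_delta(slide_start: Tuple[int, int], slide_end: Tuple[int, int]) -> Tuple[int, int]:
    # Scroll the content by the same displacement as the sliding finger.
    return (slide_end[0] - slide_start[0], slide_end[1] - slide_start[1])

# Example: sliding the finger upward by 150 px scrolls the viewport upward by 150 px.
viewport_offset = (0, 0)
dx, dy = scroll_delta(slide_start=(100, 600), slide_end=(100, 450))
viewport_offset = (viewport_offset[0] + dx, viewport_offset[1] + dy)
print(viewport_offset)  # (0, -150)
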
US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
WO2005008444A2
CLAIM 46
. The display controlling method of claim 43 , wherein said first pattern of touch is touch sliding along said touch sensor and said first function is to scroll the displayed content in a first direction (first mode) corresponding to the direction the finger is sliding in .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device, sensor means) for the gestural hardware on how a multi-touch input will be processed .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (display control) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
WO2005008444A2
CLAIM 44
. The display control (operating system status bar) ling method of claim 43 , further comprising the steps of : detecting a second pattern of touch on at least one portion of said touch sensor ;
and responding to the detection of said second pattern of touch by controlling a second function of the portable electronic display device or by changing the content displayed in said display .

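Claim 8 above places an operating system status bar in the virtual bezel region and lets a predefined gesture toggle its visibility for a full-screen mode. The toy sketch below shows only that toggle; the gesture name and the state model are assumptions made here for illustration.

class StatusBar:
    def __init__(self) -> None:
        self.visible = True  # status bar shown in the virtual bezel region by default

    def on_gesture(self, gesture: str) -> None:
        # A predefined gesture on the bezel toggles between full-screen and status-bar-visible modes.
        if gesture == "two_finger_edge_swipe":
            self.visible = not self.visible

# Example: one predefined swipe hides the bar (full-screen mode); another restores it.
bar = StatusBar()
bar.on_gesture("two_finger_edge_swipe")
print("full-screen mode" if not bar.visible else "status bar visible")
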
US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (electronic device, sensor means) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


US9645663B2
CLAIM 13
. The electronic device (electronic device, sensor means) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


US9645663B2
CLAIM 14
. An electronic device (electronic device, sensor means) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


WO2005008444A2
CLAIM 46
. The display controlling method of claim 43 , wherein said first pattern of touch is touch sliding along said touch sensor and said first function is to scroll the displayed content in a first direction (first mode) corresponding to the direction the finger is sliding in .

US9645663B2
CLAIM 15
. The electronic device (electronic device, sensor means) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device, sensor means) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device, sensor means) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area (service provider) comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


WO2005008444A2
CLAIM 38
. A method of implementing a mobile information service , the method comprising the steps of : a . providing to a customer a wireless enabled , hand-held electronic display device of claim 32 ;
b . the service provider (user input area) establishing wireless communications between said wireless hand-held device and an information server , the established wireless communications being capable of communicating information from the information server to said wireless hand-held device ;
and c . the service provider charging the customer an access fee .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device, sensor means) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (s hand) holding the electronic device utilizing the device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2005008444A2
CLAIM 1
1 . A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% , range of 144 effective ppi .

WO2005008444A2
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .


WO2005008444A2
CLAIM 38
. A method of implementing a mobile information service , the method comprising the steps of : a . providing to a customer a wireless enabled , hand-held electronic display device of claim 32 ;
b . the service provider establishing wireless communications between said wireless hand (s hand) -held device and an information server , the established wireless communications being capable of communicating information from the information server to said wireless hand-held device ;
and c . the service provider charging the customer an access fee .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20050012723A1

Filed: 2004-07-14     Issued: 2005-01-20

System and method for a portable multimedia client

(Original Assignee) MOVE MOBILE SYSTEMS Inc     (Current Assignee) MOVE MOBILE SYSTEMS Inc

Matt Pallakoff
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device, sensor means) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set (second function) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (remote device) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 16
. The portable electronic display device of claim 12 , wherein said display is a touch screen display , whereby said portable electronic display device controller unit responds to the touching of an exposed portion of said touch screen display by controlling a second function (first set) of the portable electronic display device , said second function being different than said first function .

US20050012723A1
CLAIM 35
. A hand-held remote control comprising : means for remotely controlling at least one remote device (second set) , said remote controlling means communicating control commands to said remote device ;
touch sensitive means for receiving patterns of touch by a human ;
and means for detecting the patterns of touch impressed upon said touch sensitive means , the detected patterns of touch being communicated as control commands to said remote controlling means .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US20050012723A1
CLAIM 46
. The display controlling method of claim 43 , wherein said first pattern of touch is touch sliding along said touch sensor and said first function is to scroll the displayed content in a first direction (first mode) corresponding to the direction the finger is sliding in .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US20050012723A1
CLAIM 46
. The display controlling method of claim 43 , wherein said first pattern of touch is touch sliding along said touch sensor and said first function is to scroll the displayed content in a first direction (first mode) corresponding to the direction the finger is sliding in .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device, sensor means) for the gestural hardware on how a multi-touch input will be processed .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (display control) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20050012723A1
CLAIM 44
. The display control (operating system status bar) ling method of claim 43 , further comprising the steps of : detecting a second pattern of touch on at least one portion of said touch sensor ;
and responding to the detection of said second pattern of touch by controlling a second function of the portable electronic display device or by changing the content displayed in said display .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (electronic device, sensor means) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US9645663B2
CLAIM 13
. The electronic device (electronic device, sensor means) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US9645663B2
CLAIM 14
. An electronic device (electronic device, sensor means) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US20050012723A1
CLAIM 46
. The display controlling method of claim 43 , wherein said first pattern of touch is touch sliding along said touch sensor and said first function is to scroll the displayed content in a first direction (first mode) corresponding to the direction the finger is sliding in .

US9645663B2
CLAIM 15
. The electronic device (electronic device, sensor means) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4 . 6 inches in the first dimension and inclusively within a plus or minus 15% range of 3 . 1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device, sensor means) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4.6 inches in the first dimension and inclusively within a plus or minus 15% range of 3.1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .
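
For orientation when reading the mapped limitations above, the following is a minimal, illustrative Python sketch of the claim 16 workflow: detect the grip-contact region, register it as the virtual bezel, and interpret bezel touches according to a response the user has chosen. Every name and data structure here (Touch, VirtualBezelConfig, the 20-cell grid) is a hypothetical assumption introduced only for this sketch; none of it is taken from the '663 patent or from US20050012723A1.

# Illustrative sketch only: hypothetical data structures, not the patentee's
# implementation or any vendor API.
from dataclasses import dataclass, field

@dataclass
class Touch:
    x: float          # normalized 0..1 screen coordinates
    y: float
    sustained: bool   # True if the contact persists (e.g. a holding grip)

@dataclass
class VirtualBezelConfig:
    region: set = field(default_factory=set)   # registered bezel cells
    response: str = "ignore"                   # personalized behavior

GRID = 20  # coarse grid used to register the detected grip region

def cell(t: Touch) -> tuple:
    return (int(t.x * GRID), int(t.y * GRID))

def detect_grip_region(touches: list) -> set:
    """Step 1: region of the display in contact with the holding fingers."""
    return {cell(t) for t in touches if t.sustained}

def register_bezel(touches: list, chosen_response: str) -> VirtualBezelConfig:
    """Steps 2 and 5-6: register the region and the user's chosen response."""
    return VirtualBezelConfig(region=detect_grip_region(touches),
                              response=chosen_response)

def interpret(touch: Touch, cfg: VirtualBezelConfig) -> str:
    """Steps 3-4: input inside the registered bezel follows the stored behavior."""
    return cfg.response if cell(touch) in cfg.region else "active-region input"

grip = [Touch(0.02, 0.50, True), Touch(0.03, 0.60, True)]
cfg = register_bezel(grip, chosen_response="scroll content")
print(interpret(Touch(0.02, 0.52, False), cfg))   # -> scroll content
print(interpret(Touch(0.50, 0.50, False), cfg))   # -> active-region input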

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device, sensor means) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area (service provider) comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4.6 inches in the first dimension and inclusively within a plus or minus 15% range of 3.1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US20050012723A1
CLAIM 38
. A method of implementing a mobile information service , the method comprising the steps of : a . providing to a customer a wireless enabled , hand-held electronic display device of claim 32 ;
b . the service provider (user input area) establishing wireless communications between said wireless hand-held device and an information server , the established wireless communications being capable of communicating information from the information server to said wireless hand-held device ;
and c . the service provider charging the customer an access fee .
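
As a reading aid for claim 17's steps (capture the unintentional-touch polygon, track how often that polygonal area is accessed, and promote a frequently used polygon to a personalized holding pattern that defines the virtual bezel), here is a small, hypothetical Python sketch. The promote_after threshold and the class and method names are assumptions for illustration, not limitations drawn from either patent.

# Hypothetical sketch of claim 17's polygon and frequency bookkeeping.
from collections import defaultdict

class HoldingPatternLearner:
    def __init__(self, promote_after=5):
        self.polygons = {}                  # polygon id -> list of (x, y) vertices
        self.hits = defaultdict(int)        # polygon id -> access frequency
        self.promote_after = promote_after  # assumed tuning constant
        self.virtual_bezel = None           # registered personalized pattern

    def register_polygon(self, poly_id, vertices):
        """Register the vertices of an unintentional-touch area in memory."""
        self.polygons[poly_id] = list(vertices)

    def record_access(self, poly_id):
        """Detect (count) accesses of the registered polygonal area."""
        self.hits[poly_id] += 1
        if self.hits[poly_id] >= self.promote_after:
            # Frequent use defines the personalized holding pattern,
            # which in turn defines the virtual bezel region.
            self.virtual_bezel = self.polygons[poly_id]

learner = HoldingPatternLearner()
learner.register_polygon("left-thumb", [(0.00, 0.4), (0.05, 0.4), (0.05, 0.7), (0.00, 0.7)])
for _ in range(5):
    learner.record_access("left-thumb")
print(learner.virtual_bezel)   # the promoted holding pattern / bezel vertices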

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device, sensor means) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (s hand) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20050012723A1
CLAIM 11
. A portable electronic device (electronic device, electronic device status display panel) for displaying information , the device comprising : an enclosure , which is effectively sized to be inclusively within a plus or minus 15% range of 4.6 inches in the first dimension and inclusively within a plus or minus 15% range of 3.1 inches in the second dimension ;
and a display joined to said enclosure such that an active surface of said display is visible , said display having an effective pixel count in a first dimension inclusively within a plus or minus 15% range of 600 effective pixels , and an effective pixel count in a second dimension inclusively within a plus or minus 15% range of 400 effective pixels , and an effective pixel density inclusively within a plus or minus 15% range of 144 effective ppi .

US20050012723A1
CLAIM 36
. A portable electronic display device for displaying information , the display device comprising : means for displaying information content ;
touch sensor means (electronic device, electronic device status display panel) for detecting a first pattern of touch ;
and means for detecting said first pattern of touch and responding by either controlling a first function of the portable electronic display device or by changing the content displayed in said display means .

US20050012723A1
CLAIM 38
. A method of implementing a mobile information service , the method comprising the steps of : a . providing to a customer a wireless enabled , hand-held electronic display device of claim 32 ;
b . the service provider establishing wireless communications between said wireless hand-held (s hand) device and an information server , the established wireless communications being capable of communicating information from the information server to said wireless hand-held device ;
and c . the service provider charging the customer an access fee .
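
Claim 18 differs from claim 17 mainly in how the polygon is obtained: a heat signature from the holding hand, read through the device's thermal sensors, supplies the vertices before the same frequency-based registration applies. The sketch below derives such a polygon from a hypothetical coarse thermal grid; the grid values and the 30-degree threshold are illustrative assumptions, not a real sensor API.

# Hypothetical derivation of a holding polygon from a coarse thermal grid.
def heat_polygon(thermal_grid, threshold=30.0):
    """Return bounding-box vertices of cells warmer than the threshold."""
    warm = [(x, y)
            for y, row in enumerate(thermal_grid)
            for x, temp in enumerate(row)
            if temp >= threshold]
    if not warm:
        return []
    xs, ys = zip(*warm)
    # A simple rectangular polygon covering the warm (hand-contact) area.
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

grid = [
    [22, 22, 22, 22],
    [34, 35, 22, 22],   # warm cells where the hand wraps the screen edge
    [33, 36, 22, 22],
    [22, 22, 22, 22],
]
print(heat_polygon(grid))   # -> [(0, 1), (1, 1), (1, 2), (0, 2)]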




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20030221876A1

Filed: 2002-05-31     Issued: 2003-12-04

Instrument-activated sub-surface computer buttons and system and method incorporating same

(Original Assignee) Hewlett Packard Development Co LP     (Current Assignee) Hewlett Packard Development Co LP

Paul Doczy, Stacy Wolff, Steven Homer, Mark Solomon
US9645663B2
CLAIM 1
. A display system for an electronic device (triggering signal) comprising : a touch-sensitive display screen (display screen) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (triggering signal) of response to a first set (different one) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US20030221876A1
CLAIM 3
. The remotely-activated sub-surface button of claim 1 , wherein the surface comprises at least one layer of a display screen (display screen) .

US20030221876A1
CLAIM 20
. The sub-display-screen button of claim 10 , comprising a plurality of different ones (first set) of the button disposed under the outer surface .
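
Independent claims 1 and 14 recite the same two-region structure: an active touchscreen region with a first mode of response and an adjacent virtual bezel region whose second mode selectively interprets touches as intentional input affecting the active-region content. The purely illustrative Python dispatcher below makes that structure concrete; the 10% edge width and the swipe test are assumptions introduced only for the example.

# Illustrative two-mode dispatcher for an active region and a virtual bezel.
BEZEL_FRACTION = 0.10   # assumed bezel width as a fraction of the screen

def in_virtual_bezel(x, width=1.0):
    """In this sketch the bezel runs along the left and right screen edges."""
    return x < width * BEZEL_FRACTION or x > width * (1 - BEZEL_FRACTION)

def first_mode(event):
    """First mode of response: ordinary content interaction."""
    return "active region: tap content at ({x:.2f}, {y:.2f})".format(**event)

def second_mode(event):
    """Second mode: only deliberate gestures affect the active-region content."""
    if abs(event.get("dy", 0.0)) > 0.2:       # assumed 'intentional' swipe test
        return "virtual bezel: scroll active-region content"
    return "virtual bezel: ignore resting grip"

def dispatch(event):
    return second_mode(event) if in_virtual_bezel(event["x"]) else first_mode(event)

print(dispatch({"x": 0.50, "y": 0.40}))                # active region
print(dispatch({"x": 0.02, "y": 0.40, "dy": 0.0}))     # bezel, resting grip
print(dispatch({"x": 0.02, "y": 0.40, "dy": 0.5}))     # bezel, deliberate swipe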

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (triggering signal) of response in the active touchscreen region .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (display screen) .
US20030221876A1
CLAIM 3
. The remotely-activated sub-surface button of claim 1 , wherein the surface comprises at least one layer of a display screen (display screen) .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (display screen) .
US20030221876A1
CLAIM 3
. The remotely-activated sub-surface button of claim 1 , wherein the surface comprises at least one layer of a display screen (display screen) .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (triggering signal) for the gestural hardware on how a multi-touch input will be processed .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .
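
Claims 5, 6, and 7 address the same boundary case with three alternative rules: a multi-touch input originating simultaneously in both regions is resolved to the virtual bezel (claim 5), to the active touchscreen region (claim 6), or according to an instruction the user has given (claim 7). The toy Python routine below simply makes that three-way policy explicit; the policy labels are shorthand for this sketch, not claim language.

# Toy arbitration of a multi-touch input spanning both regions (claims 5-7).
def resolve_multitouch(touch_regions, policy="user", user_choice="bezel"):
    """touch_regions: set of regions the simultaneous touches started in."""
    if touch_regions != {"bezel", "active"}:
        return touch_regions.pop()           # not a spanning input; no conflict
    if policy == "bezel":                    # claim 5 behavior
        return "bezel"
    if policy == "active":                   # claim 6 behavior
        return "active"
    return user_choice                       # claim 7: user-instructed handling

print(resolve_multitouch({"bezel", "active"}, policy="bezel"))        # -> bezel
print(resolve_multitouch({"bezel", "active"}, policy="active"))       # -> active
print(resolve_multitouch({"bezel", "active"}, user_choice="active"))  # -> active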

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (display screen) comprises an electronic device (triggering signal) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US20030221876A1
CLAIM 3
. The remotely-activated sub-surface button of claim 1 , wherein the surface comprises at least one layer of a display screen (display screen) .

US9645663B2
CLAIM 13
. The electronic device (triggering signal) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .
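
Claims 12 and 13 add an electronic device status display panel that the user can toggle between a visible and a hidden mode, together with a third set of touch-based inputs for navigating while the panel and soft buttons are hidden. The short sketch below illustrates that state handling; the gesture names are invented for the example only.

# Illustrative state handling for the claim 12/13 status-panel behavior.
class StatusPanel:
    def __init__(self):
        self.visible = True

    def handle(self, gesture):
        # "toggle-panel" stands in for whatever input toggles the panel;
        # "navigate" stands in for the third set of inputs that remains
        # available while the panel and soft buttons are hidden.
        if gesture == "toggle-panel":
            self.visible = not self.visible
            return "panel visible" if self.visible else "panel hidden"
        if gesture == "navigate" and not self.visible:
            return "navigating with panel and soft buttons hidden"
        return "regular input"

panel = StatusPanel()
print(panel.handle("toggle-panel"))   # -> panel hidden
print(panel.handle("navigate"))       # -> navigating with panel and soft buttons hidden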

US9645663B2
CLAIM 14
. An electronic device (triggering signal) comprising : a handheld interactive electronic device having a virtual bezel display screen (display screen) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (triggering signal) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US20030221876A1
CLAIM 3
. The remotely-activated sub-surface button of claim 1 , wherein the surface comprises at least one layer of a display screen (display screen) .

US9645663B2
CLAIM 15
. The electronic device (triggering signal) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (display screen) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US20030221876A1
CLAIM 3
. The remotely-activated sub-surface button of claim 1 , wherein the surface comprises at least one layer of a display screen (display screen) .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (triggering signal) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (triggering signal) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (triggering signal) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20030221876A1
CLAIM 1
. A remotely-activated sub-surface button for an electronic device , comprising : a button disposed under a surface of the electronic device , wherein the button comprises a triggering signal (electronic device, first mode, screen mode) system communicative with an electronic pointing device .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US5606346A

Filed: 1995-06-06     Issued: 1997-02-25

Coordinate input device

(Original Assignee) Panasonic Corp     (Current Assignee) Panasonic Corp

Tsutomu Kai, Masahito Matsunami
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (oscillating frequency) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US5606346A
CLAIM 1
. A coordinate input device comprising : a display panel including : a plurality of row electrodes and a plurality of column electrodes arranged in a matrix ;
and a plurality of display elements , each connected to one of the crossings of the plurality of row electrodes and the plurality of column electrodes ;
row electrode driving means for , during a display period and a row coordinate detection period , sequentially supplying row electrode driving signals to the plurality of row electrodes to effect scanning of the plurality of row electrodes , and for fixing the potential of the plurality of row electrodes constant during a column coordinate detection period ;
column electrode driving means for sequentially supplying display signals to the plurality of column electrodes during the display period , for sequentially supplying column electrode driving signals to the plurality of column electrodes to effect scanning of the plurality of column electrodes during the column coordinate detection period , and for fixing the potential of the plurality of column electrodes constant during the row coordinate detection period ;
coordinate detection means for detecting a coordinate value of an indicated position on the display panel ;
a back light disposed on the back side of the display panel for illuminating the display panel ;
an inverter circuit for generating a back light driving signal for driving the back light ;
control means for effecting switching among the display period , the row coordinate detection period , and the column coordinate detection period , the control means controlling the row electrode driving means to supply the row electrode driving signals in synchronization with the back light driving signals during the row coordinate detection period , and controlling the column electrode driving means to supply the column electrode driving signals in synchronization with the back light driving signals during the column coordinate detection period ;
and further comprising status detection means for detecting the oscillating frequency (first mode) and the amplitude of the back light driving signals correction value calculating means for calculating a correction value of the indicated position based on the detected oscillating frequency and the amplitude , correction means for receiving the coordinate value of the indicated position from the coordinate detection means and the correction value from the correction value calculating means and for subtracting the correction value from the coordinate value .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (oscillating frequency) of response in the active touchscreen region .
US5606346A
CLAIM 1
. A coordinate input device comprising : a display panel including : a plurality of row electrodes and a plurality of column electrodes arranged in a matrix ;
and a plurality of display elements , each connected to one of the crossings of the plurality of row electrodes and the plurality of column electrodes ;
row electrode driving means for , during a display period and a row coordinate detection period , sequentially supplying row electrode driving signals to the plurality of row electrodes to effect scanning of the plurality of row electrodes , and for fixing the potential of the plurality of row electrodes constant during a column coordinate detection period ;
column electrode driving means for sequentially supplying display signals to the plurality of column electrodes during the display period , for sequentially supplying column electrode driving signals to the plurality of column electrodes to effect scanning of the plurality of column electrodes during the column coordinate detection period , and for fixing the potential of the plurality of column electrodes constant during the row coordinate detection period ;
coordinate detection means for detecting a coordinate value of an indicated position on the display panel ;
a back light disposed on the back side of the display panel for illuminating the display panel ;
an inverter circuit for generating a back light driving signal for driving the back light ;
control means for effecting switching among the display period , the row coordinate detection period , and the column coordinate detection period , the control means controlling the row electrode driving means to supply the row electrode driving signals in synchronization with the back light driving signals during the row coordinate detection period , and controlling the column electrode driving means to supply the column electrode driving signals in synchronization with the back light driving signals during the column coordinate detection period ;
and further comprising status detection means for detecting the oscillating frequency (first mode) and the amplitude of the back light driving signals correction value calculating means for calculating a correction value of the indicated position based on the detected oscillating frequency and the amplitude , correction means for receiving the coordinate value of the indicated position from the coordinate detection means and the correction value from the correction value calculating means and for subtracting the correction value from the coordinate value .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (oscillating frequency) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US5606346A
CLAIM 1
. A coordinate input device comprising : a display panel including : a plurality of row electrodes and a plurality of column electrodes arranged in a matrix ;
and a plurality of display elements , each connected to one of the crossings of the plurality of row electrodes and the plurality of column electrodes ;
row electrode driving means for , during a display period and a row coordinate detection period , sequentially supplying row electrode driving signals to the plurality of row electrodes to effect scanning of the plurality of row electrodes , and for fixing the potential of the plurality of row electrodes constant during a column coordinate detection period ;
column electrode driving means for sequentially supplying display signals to the plurality of column electrodes during the display period , for sequentially supplying column electrode driving signals to the plurality of column electrodes to effect scanning of the plurality of column electrodes during the column coordinate detection period , and for fixing the potential of the plurality of column electrodes constant during the row coordinate detection period ;
coordinate detection means for detecting a coordinate value of an indicated position on the display panel ;
a back light disposed on the back side of the display panel for illuminating the display panel ;
an inverter circuit for generating a back light driving signal for driving the back light ;
control means for effecting switching among the display period , the row coordinate detection period , and the column coordinate detection period , the control means controlling the row electrode driving means to supply the row electrode driving signals in synchronization with the back light driving signals during the row coordinate detection period , and controlling the column electrode driving means to supply the column electrode driving signals in synchronization with the back light driving signals during the column coordinate detection period ;
and further comprising status detection means for detecting the oscillating frequency (first mode) and the amplitude of the back light driving signals correction value calculating means for calculating a correction value of the indicated position based on the detected oscillating frequency and the amplitude , correction means for receiving the coordinate value of the indicated position from the coordinate detection means and the correction value from the correction value calculating means and for subtracting the correction value from the coordinate value .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120266072A1

Filed: 2012-06-27     Issued: 2012-10-18

Method And System For A Digital Diary System

(Original Assignee) Broadcom Corp     (Current Assignee) Avago Technologies International Sales Pte Ltd

Jeyhan Karaoguz
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (receiving user input, capture module) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120266072A1
CLAIM 11
. A wireless communication device comprising : memory to store data and instructions ;
a processor in data communication with the memory and responsive to the stored data and instructions for controlling the wireless communication device ;
a diary management module operative to organize digital diary information for accessing , updating , outputting and processing of the digital diary information , the digital diary information including a plurality of digital diary records stored in the memory by the diary management module ;
a transaction capture module (user input, user input area) operative to automatically capture transactions that a user or the wireless communication device may conduct ;
a user interface module operative to accept user input and provide user information to the user of the wireless communication device , the user input for use in a digital diary record of the plurality of digital diary records and/or for linking various digital diary information according to user-specified factors , the user input controlling the user interface module to modify a digital diary record stored in the memory in response to the user input .

US20120266072A1
CLAIM 17
. A method comprising : at a wireless communication device , receiving user input (user input, user input area) at a user interface to define digital diary information ;
storing the digital diary information as one or more digital diary records in a memory of the wireless communication device ;
determining a location associated with the user input ;
storing location information for the determined location in a digital diary record ;
and automatically updating the digital diary information when some or all of the digital diary information has become obsolete or new information associated with digital diary information is available .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (wireless communication module) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120266072A1
CLAIM 16
. The wireless communication module (operating system status bar) of claim 11 further comprising : a location capture module operable to acquire location information , wherein the wireless communication device is operative to link the acquired location information with items of the digital diary information .
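
Claim 8 places the operating system status bar inside the virtual bezel region and lets a predefined set of gestures toggle its visibility as part of a full-screen mode. A minimal Python sketch of that toggle follows; the gesture set is hypothetical and only stands in for whatever gestures an implementation would define.

# Minimal sketch of a gesture-driven status-bar / full-screen toggle (claim 8).
FULLSCREEN_GESTURES = {"two-finger-swipe-down", "edge-double-tap"}   # assumed set

def toggle_status_bar(gesture, state):
    """state holds 'status_bar_visible' and 'fullscreen' booleans."""
    if gesture in FULLSCREEN_GESTURES:
        state["fullscreen"] = not state["fullscreen"]
        state["status_bar_visible"] = not state["fullscreen"]
    return state

state = {"status_bar_visible": True, "fullscreen": False}
print(toggle_status_bar("edge-double-tap", state))   # bar hidden, full screen on
print(toggle_status_bar("edge-double-tap", state))   # bar shown, full screen off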

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120266072A1
CLAIM 8
. The method of claim 7 further comprising : accept user input via a touch screen (electronic device status display panel) of the user interface ;
and modifying a digital diary record stored in the memory in response to the user input .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120266072A1
CLAIM 8
. The method of claim 7 further comprising : accept user input via a touch screen (electronic device status display panel) of the user interface ;
and modifying a digital diary record stored in the memory in response to the user input .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (receiving user input, capture module) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120266072A1
CLAIM 11
. A wireless communication device comprising : memory to store data and instructions ;
a processor in data communication with the memory and responsive to the stored data and instructions for controlling the wireless communication device ;
a diary management module operative to organize digital diary information for accessing , updating , outputting and processing of the digital diary information , the digital diary information including a plurality of digital diary records stored in the memory by the diary management module ;
a transaction capture module (user input, user input area) operative to automatically capture transactions that a user or the wireless communication device may conduct ;
a user interface module operative to accept user input and provide user information to the user of the wireless communication device , the user input for use in a digital diary record of the plurality of digital diary records and/or for linking various digital diary information according to user-specified factors , the user input controlling the user interface module to modify a digital diary record stored in the memory in response to the user input .

US20120266072A1
CLAIM 17
. A method comprising : at a wireless communication device , receiving user input (user input, user input area) at a user interface to define digital diary information ;
storing the digital diary information as one or more digital diary records in a memory of the wireless communication device ;
determining a location associated with the user input ;
storing location information for the determined location in a digital diary record ;
and automatically updating the digital diary information when some or all of the digital diary information has become obsolete or new information associated with digital diary information is available .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (receiving user input, capture module) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120266072A1
CLAIM 11
. A wireless communication device comprising : memory to store data and instructions ;
a processor in data communication with the memory and responsive to the stored data and instructions for controlling the wireless communication device ;
a diary management module operative to organize digital diary information for accessing , updating , outputting and processing of the digital diary information , the digital diary information including a plurality of digital diary records stored in the memory by the diary management module ;
a transaction capture module (user input, user input area) operative to automatically capture transactions that a user or the wireless communication device may conduct ;
a user interface module operative to accept user input and provide user information to the user of the wireless communication device , the user input for use in a digital diary record of the plurality of digital diary records and/or for linking various digital diary information according to user-specified factors , the user input controlling the user interface module to modify a digital diary record stored in the memory in response to the user input .

US20120266072A1
CLAIM 17
. A method comprising : at a wireless communication device , receiving user input (user input, user input area) at a user interface to define digital diary information ;
storing the digital diary information as one or more digital diary records in a memory of the wireless communication device ;
determining a location associated with the user input ;
storing location information for the determined location in a digital diary record ;
and automatically updating the digital diary information when some or all of the digital diary information has become obsolete or new information associated with digital diary information is available .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (receiving user input, capture module) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120266072A1
CLAIM 11
. A wireless communication device comprising : memory to store data and instructions ;
a processor in data communication with the memory and responsive to the stored data and instructions for controlling the wireless communication device ;
a diary management module operative to organize digital diary information for accessing , updating , outputting and processing of the digital diary information , the digital diary information including a plurality of digital diary records stored in the memory by the diary management module ;
a transaction capture module (user input, user input area) operative to automatically capture transactions that a user or the wireless communication device may conduct ;
a user interface module operative to accept user input and provide user information to the user of the wireless communication device , the user input for use in a digital diary record of the plurality of digital diary records and/or for linking various digital diary information according to user-specified factors , the user input controlling the user interface module to modify a digital diary record stored in the memory in response to the user input .

US20120266072A1
CLAIM 17
. A method comprising : at a wireless communication device , receiving user input (user input, user input area) at a user interface to define digital diary information ;
storing the digital diary information as one or more digital diary records in a memory of the wireless communication device ;
determining a location associated with the user input ;
storing location information for the determined location in a digital diary record ;
and automatically updating the digital diary information when some or all of the digital diary information has become obsolete or new information associated with digital diary information is available .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature (respective user) from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (receiving user input, capture module) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120266072A1
CLAIM 11
. A wireless communication device comprising : memory to store data and instructions ;
a processor in data communication with the memory and responsive to the stored data and instructions for controlling the wireless communication device ;
a diary management module operative to organize digital diary information for accessing , updating , outputting and processing of the digital diary information , the digital diary information including a plurality of digital diary records stored in the memory by the diary management module ;
a transaction capture module (user input, user input area) operative to automatically capture transactions that a user or the wireless communication device may conduct ;
a user interface module operative to accept user input and provide user information to the user of the wireless communication device , the user input for use in a digital diary record of the plurality of digital diary records and/or for linking various digital diary information according to user-specified factors , the user input controlling the user interface module to modify a digital diary record stored in the memory in response to the user input .

US20120266072A1
CLAIM 15
. The wireless communication device of claim 14 wherein the multi-user management module is configured to tailor digital diary functionality of the wireless communication device to each respective user (heat signature) of the plurality of users of the wireless communication device .

US20120266072A1
CLAIM 17
. A method comprising : at a wireless communication device , receiving user input (user input, user input area) at a user interface to define digital diary information ;
storing the digital diary information as one or more digital diary records in a memory of the wireless communication device ;
determining a location associated with the user input ;
storing location information for the determined location in a digital diary record ;
and automatically updating the digital diary information when some or all of the digital diary information has become obsolete or new information associated with digital diary information is available .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2530677A2

Filed: 2012-05-31     Issued: 2012-12-05

Method and apparatus for controlling a display of multimedia content using a timeline-based interface

(Original Assignee) Samsung Electronics Co Ltd     (Current Assignee) Samsung Electronics Co Ltd

Sung-Jae Hwang
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (time line) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2530677A2
CLAIM 7
The method of claim 1 , characterized in that the zoom-displaying comprises adjusting a slide position adjustment unit on the time line (first set) depending on a size of the content .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (display control) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
EP2530677A2
CLAIM 15
The apparatus of claim 10 , characterized in that the display controller (operating system status bar) adjusts a slide position adjustment unit depending on a size of the content .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2530677A2
CLAIM 1
A method for controlling a playback of content displayed on a touch screen (electronic device status display panel) using a timeline , characterized by : detecting a selection at a particular location on the timeline for a predetermined time period ;
and selectively zoom-displaying an area around the detected location on the timeline where the selection is detected .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
EP2530677A2
CLAIM 1
A method for controlling a playback of content displayed on a touch screen (electronic device status display panel) using a timeline , characterized by : detecting a selection at a particular location on the timeline for a predetermined time period ;
and selectively zoom-displaying an area around the detected location on the timeline where the selection is detected .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120229403A1

Filed: 2012-05-23     Issued: 2012-09-13

Image display that moves physical objects and causes tactile sensation

(Original Assignee) Koninklijke Philips NV     (Current Assignee) Koninklijke Philips NV

Nicolas De Jong, Elmo Marcus Attila DIEDERIKS, SR., Murray Fulton Gillies, Jurgen Jean Louis Hoppenbrouwers, Johannes Henricus Maria Korst, Thomas Caspar Kraan, Rogier Winters
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (determined direction) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120229403A1
CLAIM 9
. An image display configured to move an object near a surface of the image display , said image display comprising : a pixel array having a first plurality of pixels , said first plurality of pixels when selectively activated in response to a display signal forming a display image corresponding to the display signal ;
and an actuator matrix including a second plurality of actuators interspersed between the first plurality of pixels in said pixel array wherein each of said actuators in said second plurality of actuators comprises a ball bearing assembly (1-1) including a ball bearing (1-10) and a plurality of magnetic drivers for causing said ball bearing to rotate in a predetermined direction (touchscreen area) in dependence on a movement signal applied to said magnetic drivers , the rotation of said ball bearing selectively causing an object at a surface of the image display to be moved in a specified direction by the rotation of the ball bearing .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120194461A1

Filed: 2012-04-09     Issued: 2012-08-02

Advanced touch control of interactive map viewing via finger angle using a high dimensional touchpad (hdtp) touch user interface

(Original Assignee) Lester F. Ludwig     (Current Assignee) NRI R&d Patent Licensing LLC

Seung E. Lim
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (right position) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120194461A1
CLAIM 10
. The method of claim 1 , wherein the user interface touch sensor is additionally configured to be responsive to the left-right position (touchscreen layer, touchscreen area) of the finger on the touch sensor .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (measured data) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120194461A1
CLAIM 1
. A method for controlling an interactive map application , the method comprising : configuring a user interface touch sensor to be responsive to at least one angle of contact with at least one finger , the finger belonging to a human user of a computing device and the user interface touch sensor in communication with an operating system of the computing device ;
measuring at least one change in at least one angle of the position of the finger with respect to the surface of the touch sensor to produce measured data (operating system status bar) ;
performing real-time calculations on the measured data to produce a measured-angle value ;
and using the measured-angle value to control the value of at least one user interface parameter of an interactive map application ;
wherein at least one aspect of the interactive map application changes responsive to the angle of the position of the finger with respect to the surface of the touch sensor .
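The '461 reference claim quoted above is a measurement loop: angle samples from the touch sensor are reduced in real time to a measured-angle value, which then drives one user-interface parameter of the map application. A minimal sketch of that loop follows; the exponential smoothing and the mapping of the angle to the map heading are assumptions introduced for illustration.

# Illustrative loop for the US20120194461A1 claim 1 mapping: a finger
# angle measured at the touch sensor drives one parameter of a map view.
# The smoothing factor and the heading mapping are assumptions.
def measured_angle_value(samples, alpha=0.3):
    """Real-time calculation on measured data: simple exponential smoothing."""
    value = samples[0]
    for angle in samples[1:]:
        value = alpha * angle + (1 - alpha) * value
    return value

def update_map_view(map_state, finger_angle_samples):
    angle = measured_angle_value(finger_angle_samples)
    # One user-interface parameter of the interactive map application
    # (here, the map heading) changes responsive to the finger angle.
    map_state["heading_degrees"] = angle % 360
    return map_state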

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (right position) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120194461A1
CLAIM 10
. The method of claim 1 , wherein the user interface touch sensor is additionally configured to be responsive to the left-right position (touchscreen layer, touchscreen area) of the finger on the touch sensor .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (right position) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120194461A1
CLAIM 10
. The method of claim 1 , wherein the user interface touch sensor is additionally configured to be responsive to the left-right position (touchscreen layer, touchscreen area) of the finger on the touch sensor .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120194462A1

Filed: 2012-04-09     Issued: 2012-08-02

Advanced touch control of interactive immersive imaging applications via finger angle using a high dimensional touchpad (hdtp) touch user interface

(Original Assignee) Lester F. Ludwig     (Current Assignee) NRI R&d Patent Licensing LLC

Seung E. Lim
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (right position) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120194462A1
CLAIM 10
. The method of claim 1 , wherein the user interface touch sensor is additionally configured to be responsive to the left-right position (touchscreen layer, touchscreen area) of the finger on the touch sensor .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (measured data) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120194462A1
CLAIM 1
. A method for controlling an interactive immersive imaging application , the method comprising : configuring a user interface touch sensor to be responsive to at least one angle of contact with at least one finger , the finger belonging to a human user of a computing device and the user interface touch sensor in communication with an operating system of the computing device ;
measuring at least one change in at least one angle of the position of the finger with respect to the surface of the touch sensor to produce measured data (operating system status bar) ;
performing real-time calculations on the measured data to produce a measured-angle value ;
and using the measured-angle value to control the value of at least one user interface parameter of an interactive immersive imaging application ;
wherein at least one aspect of the interactive immersive imaging application changes responsive to the angle of the position of the finger with respect to the surface of the touch sensor .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (right position) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120194462A1
CLAIM 10
. The method of claim 1 , wherein the user interface touch sensor is additionally configured to be responsive to the left-right position (touchscreen layer, touchscreen area) of the finger on the touch sensor .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (right position) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120194462A1
CLAIM 10
. The method of claim 1 , wherein the user interface touch sensor is additionally configured to be responsive to the left-right position (touchscreen layer, touchscreen area) of the finger on the touch sensor .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20130002610A1

Filed: 2012-03-28     Issued: 2013-01-03

Touch sensitive display device

(Original Assignee) Hon Hai Precision Industry Co Ltd     (Current Assignee) Hon Hai Precision Industry Co Ltd

Hsien-Lung Ho, Chiu-Hsiung Lin
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20130002610A1
CLAIM 1
. A touch sensitive display device comprising : an interferometric modulator display panel comprising a plurality of pixel units , each of the pixel units comprising : a fixed mirror including a reflective surface ;
a transmovable mirror ;
at least one spacer arranged between the fixed mirror and the movable mirror , the movable mirror spaced an adjustable distance from the fixed mirror , the movable mirror being configured to reflect a first portion (first portion) of incident light and to allow a second portion (second portion, usage frequency) of the incident light to pass therethrough ;
at least one pressure sensor fixed on the movable mirror and configured for sensing depression of the movable mirror caused by a touch thereon , and generating a signal associated therewith ;
and a processor electrically connected to the pressure sensors of the pixel units and configured to determine a touch position according to the signals from the corresponding sensors .
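The '610 reference claim quoted above only states that a processor determines a touch position from the per-pixel pressure signals. One common way to do that, shown below purely as an assumption, is a pressure-weighted centroid over the sensor grid.

# Illustrative touch-position estimate for a per-pixel pressure sensor
# array (US20130002610A1 claim 1).  The weighted-centroid rule is an
# assumption; the reference does not spell the computation out.
def touch_position(pressure_grid):
    """pressure_grid[row][col] holds the pressure reported by that pixel."""
    total = sx = sy = 0.0
    for row, line in enumerate(pressure_grid):
        for col, p in enumerate(line):
            total += p
            sx += p * col
            sy += p * row
    if total == 0:
        return None                      # no depression sensed anywhere
    return (sx / total, sy / total)      # pressure-weighted centroid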

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (first portion) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (second portion) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20130002610A1
CLAIM 1
. A touch sensitive display device comprising : an interferometric modulator display panel comprising a plurality of pixel units , each of the pixel units comprising : a fixed mirror including a reflective surface ;
a transmovable mirror ;
at least one spacer arranged between the fixed mirror and the movable mirror , the movable mirror spaced an adjustable distance from the fixed mirror , the movable mirror being configured to reflect a first portion (first portion) of incident light and to allow a second portion (second portion, usage frequency) of the incident light to pass therethrough ;
at least one pressure sensor fixed on the movable mirror and configured for sensing depression of the movable mirror caused by a touch thereon , and generating a signal associated therewith ;
and a processor electrically connected to the pressure sensors of the pixel units and configured to determine a touch position according to the signals from the corresponding sensors .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20130002610A1
CLAIM 1
. A touch sensitive display device comprising : an interferometric modulator display panel comprising a plurality of pixel units , each of the pixel units comprising : a fixed mirror including a reflective surface ;
a transmovable mirror ;
at least one spacer arranged between the fixed mirror and the movable mirror , the movable mirror spaced an adjustable distance from the fixed mirror , the movable mirror being configured to reflect a first portion of incident light and to allow a second portion (second portion, usage frequency) of the incident light to pass therethrough ;
at least one pressure sensor fixed on the movable mirror and configured for sensing depression of the movable mirror caused by a touch thereon , and generating a signal associated therewith ;
and a processor electrically connected to the pressure sensors of the pixel units and configured to determine a touch position according to the signals from the corresponding sensors .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (second portion) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20130002610A1
CLAIM 1
. A touch sensitive display device comprising : an interferometric modulator display panel comprising a plurality of pixel units , each of the pixel units comprising : a fixed mirror including a reflective surface ;
a transmovable mirror ;
at least one spacer arranged between the fixed mirror and the movable mirror , the movable mirror spaced an adjustable distance from the fixed mirror , the movable mirror being configured to reflect a first portion of incident light and to allow a second portion (second portion, usage frequency) of the incident light to pass therethrough ;
at least one pressure sensor fixed on the movable mirror and configured for sensing depression of the movable mirror caused by a touch thereon , and generating a signal associated therewith ;
and a processor electrically connected to the pressure sensors of the pixel units and configured to determine a touch position according to the signals from the corresponding sensors .
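Claim 18 charted above differs from claim 17 mainly in how the grip area is sensed: the polygon comes from a heat signature read by the device's thermal sensors. The sketch below covers only that step; the temperature threshold and the bounding-box "polygon" are assumptions made for brevity, and the resulting vertices would feed the same register / count / promote loop sketched for claim 17 earlier in this chart.

# Illustrative conversion of a thermal sensor grid into the polygonal
# grip area recited in US9645663B2 claim 18.  The 30 C threshold and the
# bounding-box polygon are assumptions; a convex hull of the warm cells
# would serve the same purpose.
def heat_signature_polygon(thermal_grid, threshold_c=30.0):
    warm = [(x, y) for y, row in enumerate(thermal_grid)
                   for x, t in enumerate(row) if t >= threshold_c]
    if not warm:
        return []                                   # no hand detected
    xs, ys = [p[0] for p in warm], [p[1] for p in warm]
    # Vertices of a polygonal area covering the heat signature.
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]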




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120223900A1

Filed: 2012-02-27     Issued: 2012-09-06

Display device

(Original Assignee) Alps Electric Co Ltd; Denso Ten Ltd     (Current Assignee) Denso Ten Ltd ; Alps Alpine Co Ltd

Motoya JIYAMA, Naoki SUGAMOTO, Nobuyuki Batou, Kiyoshi Hamatani, Kohji Miyazato, Hiroyuki Yanai, Sadaharu YAMAMOTO, Eiji Umetsu, Shuji Yanagi, Masahiko Ishizone
US9645663B2
CLAIM 1
. A display system for an electronic device (on state) comprising : a touch-sensitive display screen (touch panel) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel (display screen, screen mode) that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .
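The '900 reference claim quoted above suppresses the frequency content tied to the vehicle's vibration state before calculating a pressed position from the detector pressures. The sketch below is only a stand-in for that pipeline: the moving-average filter and the strongest-detector position rule are assumptions, whereas the reference claims a blocking unit tuned to the vibration state rather than this particular filter.

# Illustrative stand-in for the US20120223900A1 claim 1 pipeline: clean
# each detector's pressure signal, then calculate a pressed position.
def block_vibration(samples, window=5):
    """Crude smoothing: average out fast, vibration-like fluctuations."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def pressed_position(detectors):
    """detectors: list of (x, y, pressure_samples) per detecting unit."""
    best = max(detectors, key=lambda d: block_vibration(d[2])[-1])
    return (best[0], best[1])            # position of the strongest cleaned signal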

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch panel) .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel (display screen, screen mode) that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch panel) .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel (display screen, screen mode) that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (on state) for the gestural hardware on how a multi-touch input will be processed .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .
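Dependent claims 5, 6 and 7 charted above describe three policies for a multi-touch input that originates simultaneously in both regions: process it within the virtual bezel, process it within the active region, or follow an instruction the user has registered for the gestural hardware. A minimal policy selector is sketched below; the policy names are assumptions.

# Illustrative routing for a multi-touch input that originates
# simultaneously in the active region and the virtual bezel region
# (US9645663B2 claims 5-7).  Policy and option names are assumptions.
def route_simultaneous_multitouch(touches, policy="user_instruction",
                                  user_choice="bezel"):
    if policy == "always_bezel":          # claim 5: process within the bezel
        return ("virtual_bezel", touches)
    if policy == "always_active":         # claim 6: process within the active region
        return ("active_region", touches)
    # claim 7: follow the instruction the user registered for the gestural hardware
    target = "virtual_bezel" if user_choice == "bezel" else "active_region"
    return (target, touches)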

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch panel) comprises an electronic device (on state) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel (display screen, screen mode) that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 13
. The electronic device (on state) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 14
. An electronic device (on state) comprising : a handheld interactive electronic device having a virtual bezel display screen (touch panel) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel (display screen, screen mode) that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 15
. The electronic device (on state) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch panel) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel (display screen, screen mode) that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (on state) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .
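Claim 16 charted above adds an interactive step to the bezel-definition flow: after the grip region is detected and registered, the device offers the user a choice of response types and stores that choice as personalized behavior for the region. The sketch below shows only that bookkeeping; the response option names are assumptions.

# Illustrative personalization step from US9645663B2 claim 16: the user
# picks how touches in the detected grip region should be handled, and
# the choice is stored per region.  Option names are assumptions.
RESPONSE_OPTIONS = ("ignore", "scroll_content", "show_status_bar")

personalized_behavior = {}   # region id -> chosen response

def register_user_choice(region_id, chosen_response):
    if chosen_response not in RESPONSE_OPTIONS:
        raise ValueError("unknown response type")
    personalized_behavior[region_id] = chosen_response

def respond_to_bezel_touch(region_id, event):
    # Interpret bezel input according to the stored personalized behavior.
    return personalized_behavior.get(region_id, "ignore"), event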

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (on state) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (on state) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120223900A1
CLAIM 1
. A display device incorporated in a vehicle , comprising : a touch panel that receives a pressing operation ;
a plurality of detecting units that detect a pressure value on the touch panel ;
a blocking unit that blocks a frequency band corresponding to a vibration state (electronic device) of a vehicle based on the pressure value detected by the detecting unit ;
and a calculating unit that calculates a pressed position based on a result of blocking by the blocking unit .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120105370A1

Filed: 2011-12-23     Issued: 2012-05-03

Electroded Sheet for a Multitude of Products

(Original Assignee) Nupix LLC     (Current Assignee) Nupix LLC

Chad B. Moore
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (antiglare surface, panel display) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120105370A1
CLAIM 18
. The electroded sheet of claim 17 , wherein the electroded sheet forms at least a portion of a device selected from the group consisting of : i) a flat panel display (touchscreen layer) ;
ii) a solar cell ;
iii) a fuel cell ;
iv) a battery ;
v) a resistive touch screen ;
vi) a capacitive touch screen ;
vii) a projective capacitive touch screen ;
viii) a EMI/EMF shield ;
and ix) an antenna .

US20120105370A1
CLAIM 20
. The electroded sheet of claim 17 , further comprising additional surface structure on the electroded sheet selected from the group consisting of : a) a lens array ;
b) a stimpled antiglare surface (touchscreen layer) ;
c) a liquid crystal alignment layer ;
d) a liquid crystal anchoring layer ;
e) at least one channel for gas or liquid ;
and f) any combination of a) through e) .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120105370A1
CLAIM 18
. The electroded sheet of claim 17 , wherein the electroded sheet forms at least a portion of a device selected from the group consisting of : i) a flat panel display ;
ii) a solar cell ;
iii) a fuel cell ;
iv) a battery ;
v) a resistive touch screen (electronic device status display panel) ;
vi) a capacitive touch screen ;
vii) a projective capacitive touch screen ;
viii) a EMI/EMF shield ;
and ix) an antenna .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120105370A1
CLAIM 18
. The electroded sheet of claim 17 , wherein the electroded sheet forms at least a portion of a device selected from the group consisting of : i) a flat panel display ;
ii) a solar cell ;
iii) a fuel cell ;
iv) a battery ;
v) a resistive touch screen (electronic device status display panel) ;
vi) a capacitive touch screen ;
vii) a projective capacitive touch screen ;
viii) a EMI/EMF shield ;
and ix) an antenna .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (antiglare surface, panel display) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120105370A1
CLAIM 18
. The electroded sheet of claim 17 , wherein the electroded sheet forms at least a portion of a device selected from the group consisting of : i) a flat panel display (touchscreen layer) ;
ii) a solar cell ;
iii) a fuel cell ;
iv) a battery ;
v) a resistive touch screen ;
vi) a capacitive touch screen ;
vii) a projective capacitive touch screen ;
viii) a EMI/EMF shield ;
and ix) an antenna .

US20120105370A1
CLAIM 20
. The electroded sheet of claim 17 , further comprising additional surface structure on the electroded sheet selected from the group consisting of : a) a lens array ;
b) a stimpled antiglare surface (touchscreen layer) ;
c) a liquid crystal alignment layer ;
d) a liquid crystal anchoring layer ;
e) at least one channel for gas or liquid ;
and f) any combination of a) through e) .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (wireless communication link) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120105370A1
CLAIM 6
. The electronic display component of claim 4 , wherein the projected capacitive touch sensor further comprises a wireless communication link (s hand) between the touch sensor and a second device .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20130063364A1

Filed: 2011-09-12     Issued: 2013-03-14

Using pressure differences with a touch-sensitive display screen

(Original Assignee) Motorola Mobility LLC     (Current Assignee) Google Technology Holdings LLC

Stephen C. Moore
US9645663B2
CLAIM 1
. A display system for an electronic device (personal communications) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (predefined criterion) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20130063364A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action ;
wherein receiving a touch on the touch-sensitive screen comprises receiving a series of datapoints ;
and wherein associating a pressure with the touch comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion (second set) is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the touch as a function of the baseline datapoint .

US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .
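The '364 reference claim 1 quoted above computes a touch pressure against a baseline datapoint and branches on a non-zero threshold; its claim 6, quoted later in this chart, spells out one pressure function, an amplitude divided by the square root of the baseline datapoint's size. The sketch below combines the two; the variation criterion (a fixed size delta) and the threshold value are assumptions.

# Illustrative baseline-and-threshold flow from US20130063364A1 claims 1
# and 6.  The fixed size delta and the threshold are assumptions; the
# square-root pressure function follows one alternative of claim 6.
import math

def associate_pressure(datapoints, delta=5.0):
    """Each datapoint is (amplitude, size).  Baseline = the current datapoint
    once the size varies from its predecessor by at least `delta`."""
    baseline = datapoints[0]
    for prev, cur in zip(datapoints, datapoints[1:]):
        if abs(cur[1] - prev[1]) >= delta:     # first pre-defined criterion met
            baseline = cur
            break
    amplitude, size = datapoints[-1][0], baseline[1]
    return amplitude / math.sqrt(size)         # claim 6 pressure function

def respond(datapoints, threshold=1.5):
    pressure = associate_pressure(datapoints)
    return "first_ui_action" if pressure < threshold else "second_ui_action"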

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (personal communications) for the gestural hardware on how a multi-touch input will be processed .
US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (personal communications) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 13
. The electronic device (personal communications) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 14
. An electronic device (personal communications) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20130063364A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action ;
wherein receiving a touch on the touch-sensitive screen comprises receiving a series of datapoints ;
and wherein associating a pressure with the touch comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the touch as a function of the baseline datapoint .

US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 15
. The electronic device (personal communications) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20130063364A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action ;
wherein receiving a touch on the touch-sensitive screen comprises receiving a series of datapoints ;
and wherein associating a pressure with the touch comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the touch as a function of the baseline datapoint .

US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20130063364A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action ;
wherein receiving a touch on the touch-sensitive screen comprises receiving a series of datapoints ;
and wherein associating a pressure with the touch comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the touch as a function of the baseline datapoint .

US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (square root) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20130063364A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action ;
wherein receiving a touch on the touch-sensitive screen comprises receiving a series of datapoints ;
and wherein associating a pressure with the touch comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the touch as a function of the baseline datapoint .

US20130063364A1
CLAIM 6
. The method of claim 4 wherein the associated pressure of the touch is computed as (an element selected from the group consisting of : an amplitude of a current datapoint and an amplitude of the baseline datapoint) divided by a square root (thermal sensors, s thermal sensors) of a size of the baseline datapoint .

US20130063364A1
CLAIM 12
. The personal electronic device of claim 11 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20130063389A1

Filed: 2011-09-12     Issued: 2013-03-14

Using pressure differences with a touch-sensitive display screen

(Original Assignee) Motorola Mobility LLC     (Current Assignee) Google Technology Holdings LLC

Stephen C. Moore
US9645663B2
CLAIM 1
. A display system for an electronic device (personal communications) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (predefined criterion) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20130063389A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a series of datapoints from the touch-sensitive screen ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action ;
wherein associating a pressure with a datapoint comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion (second set) is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the datapoint as a function of the baseline datapoint .
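A short Python sketch of the rate-of-change variant recited here: pressures are associated with a series of datapoints, finite differences give a rate of change, and a user-interface action is chosen from it. The timestamps, the rate threshold, and the two action hooks are assumptions for illustration.

def pressure_rates(samples):
    # samples: list of (timestamp_s, pressure) pairs; returns finite differences.
    return [(p2 - p1) / (t2 - t1)
            for (t1, p1), (t2, p2) in zip(samples, samples[1:]) if t2 > t1]

def act_on_pressure_rate(samples, fast_press_action, slow_press_action,
                         rate_threshold=5.0):   # hypothetical units: pressure/second
    rates = pressure_rates(samples)
    if not rates:
        return
    (fast_press_action if max(rates) >= rate_threshold else slow_press_action)()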

US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (personal communications) for the gestural hardware on how a multi-touch input will be processed .
US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (personal communications) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 13
. The electronic device (personal communications) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 14
. An electronic device (personal communications) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20130063389A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a series of datapoints from the touch-sensitive screen ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action ;
wherein associating a pressure with a datapoint comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the datapoint as a function of the baseline datapoint .

US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 15
. The electronic device (personal communications) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
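A hedged Python sketch of the personalization flow in claim 16: the grip-contact region is registered as the virtual bezel, the user is offered a choice of response type, and the choice is stored for that region. The storage keys, the option list, and the ask_user prompt callback are illustrative assumptions.

def define_virtual_bezel(device_memory, detected_region, ask_user):
    # Register the detected grip-contact region as the virtual bezel.
    device_memory["virtual_bezel_region"] = tuple(detected_region)
    # Offer the user to instruct the system what type of response to execute.
    choice = ask_user("How should touches in this area be handled?",
                      ["scroll", "ignore", "shortcut"])
    # Register the response instruction as personalized behavior for the region.
    device_memory.setdefault("bezel_behavior", {})[tuple(detected_region)] = choice
    return choice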
US20130063389A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a series of datapoints from the touch-sensitive screen ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action ;
wherein associating a pressure with a datapoint comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the datapoint as a function of the baseline datapoint .

US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
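An illustrative Python sketch of claim 17's flow: unintentional grip touches define a polygon, its access frequency is counted, and the polygon plus its usage frequency are stored as a personalized holding pattern that defines the virtual bezel. The frequency cutoff and storage keys are assumptions.

from collections import Counter

class HoldingPatternLearner:
    def __init__(self, min_hits=20):          # hypothetical usage-frequency cutoff
        self.hits = Counter()
        self.min_hits = min_hits

    def record_unintentional_touch(self, polygon_vertices, memory):
        key = tuple(polygon_vertices)
        memory.setdefault("registered_polygons", set()).add(key)   # register the polygon
        self.hits[key] += 1                                        # detect access frequency
        if self.hits[key] >= self.min_hits:
            memory["holding_pattern"] = key                        # personalized holding pattern
            memory["virtual_bezel_region"] = key                   # defines the virtual bezel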
US20130063389A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a series of datapoints from the touch-sensitive screen ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action ;
wherein associating a pressure with a datapoint comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the datapoint as a function of the baseline datapoint .

US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (square root) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
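A hypothetical Python sketch of claim 18's thermal variant: sensor readings above a warmth threshold are turned into a polygonal area (a bounding box stands in for the claimed polygon here), which is registered as the virtual bezel / holding pattern. The heat-map representation and the warmth threshold are assumptions.

def polygon_from_heat_map(heat_map, warm_threshold=30.0):   # degrees C, assumed
    # heat_map: dict mapping (x, y) sensor cells to temperature readings.
    warm = [(x, y) for (x, y), temp in heat_map.items() if temp >= warm_threshold]
    if not warm:
        return []
    xs, ys = zip(*warm)
    # Bounding box used as a simple stand-in for the claimed polygonal area.
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

def register_heat_bezel(memory, heat_map):
    polygon = polygon_from_heat_map(heat_map)
    if polygon:
        memory["virtual_bezel_region"] = tuple(polygon)
        memory["holding_pattern"] = tuple(polygon)
    return polygon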
US20130063389A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a series of datapoints from the touch-sensitive screen ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action ;
wherein associating a pressure with a datapoint comprises : comparing datapoints in the series with one another until a variation in the datapoints fulfills a first pre-defined criterion ;
when the first predefined criterion is met , defining a baseline datapoint as a current datapoint ;
and computing the associated pressure of the datapoint as a function of the baseline datapoint .

US20130063389A1
CLAIM 6
. The method of claim 4 wherein the associated pressure of the datapoint is computed as (an element selected from the group consisting of : an amplitude of a current datapoint and an amplitude of the baseline datapoint) divided by a square root (thermal sensors, s thermal sensors) of a size of the baseline datapoint .

US20130063389A1
CLAIM 13
. The personal electronic device of claim 12 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2012021417A1

Filed: 2011-08-05     Issued: 2012-02-16

Method and system for adjusting display content

(Original Assignee) Qualcomm Incorporated     

Anthony T. Blow, Babak Forutanpour, Ted R. Gooding, David Bednar
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2012021417A1
CLAIM 21
. The method of claim 1 , wherein adjusting the presentation display to inform the user of the concealed displayed content comprises : determining a plurality of alternative adjustments to the presentation display based upon a layout of the displayed content and the determining displayed content that is concealed by the user's hand based on the grip event ;
sequentially implementing one of the determined plurality of alternative adjustments to the presentation display ;
detecting a user input (user input) indicating selection of one of the determined plurality of alternative adjustments to the presentation display ;
and implementing the selected one of the determined plurality of alternative adjustments to the presentation display .
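A rough Python sketch of the loop this reference claim recites: alternative layout adjustments are previewed in sequence until the user signals a selection, which the caller then applies. The preview and user_selected callbacks, the poll interval, and the round limit are illustrative assumptions.

import time

def choose_adjustment(adjustments, preview, user_selected, poll_s=0.5, max_rounds=3):
    # Sequentially implement one of the determined alternative adjustments,
    # watching for a user input indicating selection of the current one.
    for _ in range(max_rounds):
        for adjustment in adjustments:
            preview(adjustment)
            time.sleep(poll_s)
            if user_selected():
                return adjustment   # caller implements the selected adjustment
    return None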

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
WO2012021417A1
CLAIM 21
. The method of claim 1 , wherein adjusting the presentation display to inform the user of the concealed displayed content comprises : determining a plurality of alternative adjustments to the presentation display based upon a layout of the displayed content and the determining displayed content that is concealed by the user's hand based on the grip event ;
sequentially implementing one of the determined plurality of alternative adjustments to the presentation display ;
detecting a user input (user input) indicating selection of one of the determined plurality of alternative adjustments to the presentation display ;
and implementing the selected one of the determined plurality of alternative adjustments to the presentation display .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2012021417A1
CLAIM 21
. The method of claim 1 , wherein adjusting the presentation display to inform the user of the concealed displayed content comprises : determining a plurality of alternative adjustments to the presentation display based upon a layout of the displayed content and the determining displayed content that is concealed by the user's hand based on the grip event ;
sequentially implementing one of the determined plurality of alternative adjustments to the presentation display ;
detecting a user input (user input) indicating selection of one of the determined plurality of alternative adjustments to the presentation display ;
and implementing the selected one of the determined plurality of alternative adjustments to the presentation display .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012021417A1
CLAIM 21
. The method of claim 1 , wherein adjusting the presentation display to inform the user of the concealed displayed content comprises : determining a plurality of alternative adjustments to the presentation display based upon a layout of the displayed content and the determining displayed content that is concealed by the user's hand based on the grip event ;
sequentially implementing one of the determined plurality of alternative adjustments to the presentation display ;
detecting a user input (user input) indicating selection of one of the determined plurality of alternative adjustments to the presentation display ;
and implementing the selected one of the determined plurality of alternative adjustments to the presentation display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (sensed location, more sensor) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2012021417A1
CLAIM 17
. The method of claim 1 , further comprising : sensing locations of the user's fingers on a surface of the mobile device other than the touchscreen display ;
determining a suggested thumb location based upon the sensed location (thermal sensors) of the user's fingers ;
and displaying the suggested thumb location on the touchscreen display .

WO2012021417A1
CLAIM 21
. The method of claim 1 , wherein adjusting the presentation display to inform the user of the concealed displayed content comprises : determining a plurality of alternative adjustments to the presentation display based upon a layout of the displayed content and the determining displayed content that is concealed by the user's hand based on the grip event ;
sequentially implementing one of the determined plurality of alternative adjustments to the presentation display ;
detecting a user input (user input) indicating selection of one of the determined plurality of alternative adjustments to the presentation display ;
and implementing the selected one of the determined plurality of alternative adjustments to the presentation display .

WO2012021417A1
CLAIM 80
. The non-transitory processor-readable storage medium of claim 64 , wherein the stored processor-executable instructions are configured to cause a processor of a mobile device to perform operations further comprising : determining locations of the user's fingers from one or more sensors (thermal sensors) positioned on a surface of the mobile device other than the touchscreen display ;
determining a suggested thumb location based upon the sensed location of the user's fingers ;
and displaying the suggested thumb location on the touchscreen display .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
JP2013030050A

Filed: 2011-07-29     Issued: 2013-02-07

User interface device capable of receiving input via a screen pad, input processing method, and program

(Original Assignee) Kddi Corp; Kddi株式会社     

Tomoaki Matsuki, 友明 松木
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (monitor) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
JP2013030050A
CLAIM 4
The user interface device according to any one of claims 1 to 3 , wherein , when the contact determination means determines that the finger has contacted the screen pad and that the contact position has not changed continuously for a predetermined time or longer , the operation control means activates a "touch focus" mode in which the contact position of the finger is monitored (first portion) while the entire image is held fixed without being moved .
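A hedged Python sketch of the "touch focus" behaviour this reference claim recites: if a finger stays in contact without the contact position changing for a predetermined time, the view stops panning and only the contact position is monitored. The dwell time and movement tolerance are illustrative assumptions.

DWELL_S = 0.5          # assumed predetermined time
MOVE_TOLERANCE = 4.0   # assumed pixels of allowed jitter

def update_mode(touch_track):
    # touch_track: list of (timestamp_s, x, y); returns "touch_focus" or "pan".
    if not touch_track:
        return "pan"
    t_last, x_last, y_last = touch_track[-1]
    for t, x, y in reversed(touch_track):
        if ((x - x_last) ** 2 + (y - y_last) ** 2) ** 0.5 > MOVE_TOLERANCE:
            return "pan"                 # contact position changed: keep panning
        if t_last - t >= DWELL_S:
            return "touch_focus"         # image held fixed, contact position monitored
    return "pan"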

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (monitor) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
JP2013030050A
CLAIM 4
The user interface device according to any one of claims 1 to 3 , wherein , when the contact determination means determines that the finger has contacted the screen pad and that the contact position has not changed continuously for a predetermined time or longer , the operation control means activates a "touch focus" mode in which the contact position of the finger is monitored (first portion) while the entire image is held fixed without being moved .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120044172A1

Filed: 2011-07-26     Issued: 2012-02-23

Information processing apparatus, program, and operation control method

(Original Assignee) Sony Corp     (Current Assignee) Sony Corp

Yoshihito Ohki, Yusuke MIYAZAWA, Ikuo Yamano
US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (operation control) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (display unit) using predefined set of gestures to toggle a full-screen mode .
US20120044172A1
CLAIM 1
. An information processing apparatus comprising : a detection unit for detecting pressure applied by user input performed on a touch screen ;
a determination unit for determining which of two or more input states the user input belongs to , in accordance with the pressure detected by the detection unit ;
and an operation control (operating system status bar) unit for enabling or disabling a limitation imposed on operation with a user interface displayed on the touch screen , in accordance with the state of the user input determined by the determination unit .

US20120044172A1
CLAIM 3
. The information processing apparatus according to claim 1 , further comprising : a display unit (status bar visibility) for displaying , on the touch screen , the state of the user input determined by the determination unit .
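An illustrative Python sketch of the structure recited in this reference: a detection step yields a pressure, a determination step maps it to one of two or more input states by threshold comparison (claim 2), the determined state is displayed (claim 3), and an operation-control step enables or disables a limitation on the on-screen user interface accordingly. The threshold values and state names are assumptions.

STATE_THRESHOLDS = [(1.0, "light"), (2.5, "firm")]   # hypothetical state boundaries

def determine_input_state(pressure):
    # Determine which input state the user input belongs to by threshold comparison.
    for threshold, state in STATE_THRESHOLDS:
        if pressure < threshold:
            return state
    return "hard"

def operation_control(ui, pressure):
    state = determine_input_state(pressure)
    # Enable the limitation for light touches, lift it for firmer ones (assumed policy).
    ui["limitation_enabled"] = (state == "light")
    ui["displayed_state"] = state     # claim 3: display the determined state
    return ui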

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (threshold value) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120044172A1
CLAIM 2
. The information processing apparatus according to claim 1 , wherein the determination unit determines which of two or more input states the user input belongs to by comparing the pressure with a threshold value (s hand) .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120023450A1

Filed: 2011-06-22     Issued: 2012-01-26

User interface device and user interface method

(Original Assignee) Sony Corp     (Current Assignee) Sony Corp

Ryuichiro Noto, Rikizo Tabe, Katsuhiko Nunokawa, Takeshi Yamamoto
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (touch panel, display screen) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120023450A1
CLAIM 1
. A user interface device comprising : a display unit on which plural buttons are displayed ;
a touch panel (screen mode, display screen) integrally formed with the display unit and detecting input in plural input areas corresponding to respective display areas of the plural buttons ;
a pressure detection means for detecting water pressure ;
and a layout control means for changing a layout of the buttons and the input areas so as to reduce the number of the buttons and the input areas corresponding to the buttons in accordance with detected magnitude of water pressure when the water pressure detected by the pressure detection means is equal to or higher than a previously set threshold value .

US20120023450A1
CLAIM 6
. The user interface device according to claim 2 , wherein the layout control means changes the layout so that the size of the input area is made to be approximately the same as the size of a display screen (screen mode, display screen) in the display unit when the number of buttons and the input areas corresponding to the buttons is reduced to one .
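A hedged Python sketch of the layout control this reference recites: when detected water pressure meets or exceeds a preset threshold, the number of displayed buttons (and their input areas) is reduced in accordance with the detected magnitude, and once only one button remains its input area is made roughly the size of the display screen (claim 6). The pressure-to-count mapping is an illustrative assumption.

def reduced_button_count(water_pressure_kpa, threshold_kpa=110.0, full_count=9):
    if water_pressure_kpa < threshold_kpa:
        return full_count
    # Shrink the layout as detected pressure grows, never below one button.
    excess = water_pressure_kpa - threshold_kpa
    return max(1, full_count - int(excess // 20))   # 20 kPa per removed button (assumed)

def layout_for(buttons, water_pressure_kpa, screen_size):
    count = reduced_button_count(water_pressure_kpa)
    kept = buttons[:count]
    if count == 1:
        # Single remaining input area approximately matches the display screen size.
        return [{"button": kept[0], "area": screen_size}]
    return [{"button": b, "area": None} for b in kept]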

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch panel, display screen) .
US20120023450A1
CLAIM 1
. A user interface device comprising : a display unit on which plural buttons are displayed ;
a touch panel (screen mode, display screen) integrally formed with the display unit and detecting input in plural input areas corresponding to respective display areas of the plural buttons ;
a pressure detection means for detecting water pressure ;
and a layout control means for changing a layout of the buttons and the input areas so as to reduce the number of the buttons and the input areas corresponding to the buttons in accordance with detected magnitude of water pressure when the water pressure detected by the pressure detection means is equal to or higher than a previously set threshold value .

US20120023450A1
CLAIM 6
. The user interface device according to claim 2 , wherein the layout control means changes the layout so that the size of the input area is made to be approximately the same as the size of a display screen (screen mode, display screen) in the display unit when the number of buttons and the input areas corresponding to the buttons is reduced to one .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch panel, display screen) .
US20120023450A1
CLAIM 1
. A user interface device comprising : a display unit on which plural buttons are displayed ;
a touch panel (screen mode, display screen) integrally formed with the display unit and detecting input in plural input areas corresponding to respective display areas of the plural buttons ;
a pressure detection means for detecting water pressure ;
and a layout control means for changing a layout of the buttons and the input areas so as to reduce the number of the buttons and the input areas corresponding to the buttons in accordance with detected magnitude of water pressure when the water pressure detected by the pressure detection means is equal to or higher than a previously set threshold value .

US20120023450A1
CLAIM 6
. The user interface device according to claim 2 , wherein the layout control means changes the layout so that the size of the input area is made to be approximately the same as the size of a display screen (screen mode, display screen) in the display unit when the number of buttons and the input areas corresponding to the buttons is reduced to one .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch panel, display screen) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120023450A1
CLAIM 1
. A user interface device comprising : a display unit on which plural buttons are displayed ;
a touch panel (screen mode, display screen) integrally formed with the display unit and detecting input in plural input areas corresponding to respective display areas of the plural buttons ;
a pressure detection means for detecting water pressure ;
and a layout control means for changing a layout of the buttons and the input areas so as to reduce the number of the buttons and the input areas corresponding to the buttons in accordance with detected magnitude of water pressure when the water pressure detected by the pressure detection means is equal to or higher than a previously set threshold value .

US20120023450A1
CLAIM 6
. The user interface device according to claim 2 , wherein the layout control means changes the layout so that the size of the input area is made to be approximately the same as the size of a display screen (screen mode, display screen) in the display unit when the number of buttons and the input areas corresponding to the buttons is reduced to one .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (touch panel, display screen) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120023450A1
CLAIM 1
. A user interface device comprising : a display unit on which plural buttons are displayed ;
a touch panel (screen mode, display screen) integrally formed with the display unit and detecting input in plural input areas corresponding to respective display areas of the plural buttons ;
a pressure detection means for detecting water pressure ;
and a layout control means for changing a layout of the buttons and the input areas so as to reduce the number of the buttons and the input areas corresponding to the buttons in accordance with detected magnitude of water pressure when the water pressure detected by the pressure detection means is equal to or higher than a previously set threshold value .

US20120023450A1
CLAIM 6
. The user interface device according to claim 2 , wherein the layout control means changes the layout so that the size of the input area is made to be approximately the same as the size of a display screen (screen mode, display screen) in the display unit when the number of buttons and the input areas corresponding to the buttons is reduced to one .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch panel, display screen) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120023450A1
CLAIM 1
. A user interface device comprising : a display unit on which plural buttons are displayed ;
a touch panel (screen mode, display screen) integrally formed with the display unit and detecting input in plural input areas corresponding to respective display areas of the plural buttons ;
a pressure detection means for detecting water pressure ;
and a layout control means for changing a layout of the buttons and the input areas so as to reduce the number of the buttons and the input areas corresponding to the buttons in accordance with detected magnitude of water pressure when the water pressure detected by the pressure detection means is equal to or higher than a previously set threshold value .

US20120023450A1
CLAIM 6
. The user interface device according to claim 2 , wherein the layout control means changes the layout so that the size of the input area is made to be approximately the same as the size of a display screen (screen mode, display screen) in the display unit when the number of buttons and the input areas corresponding to the buttons is reduced to one .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120306765A1

Filed: 2011-06-01     Issued: 2012-12-06

Using pressure differences with a touch-sensitive display screen

(Original Assignee) Motorola Mobility LLC     (Current Assignee) Google Technology Holdings LLC

Stephen C. Moore
US9645663B2
CLAIM 1
. A display system for an electronic device (personal communications) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120306765A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a plurality of datapoints from the touch-sensitive screen , each datapoint comprising position information ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action .

US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (personal communications) for the gestural hardware on how a multi-touch input will be processed .
US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (personal communications) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 13
. The electronic device (personal communications) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 14
. An electronic device (personal communications) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120306765A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a plurality of datapoints from the touch-sensitive screen , each datapoint comprising position information ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action .

US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 15
. The electronic device (personal communications) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120306765A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a plurality of datapoints from the touch-sensitive screen , each datapoint comprising position information ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action .

US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120306765A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a plurality of datapoints from the touch-sensitive screen , each datapoint comprising position information ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action .

US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120306765A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a plurality of datapoints from the touch-sensitive screen , each datapoint comprising position information ;
for each of a plurality of the received datapoints , associating a pressure with the datapoint ;
associating at least one rate of change of pressure with at least a subset of the datapoints ;
and based , at least in part , on the associated rate-of-change-of-pressure information , performing a user-interface action .

US20120306765A1
CLAIM 9
. The personal electronic device of claim 8 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120306766A1

Filed: 2011-06-01     Issued: 2012-12-06

Using pressure differences with a touch-sensitive display screen

(Original Assignee) Motorola Mobility LLC     (Current Assignee) Google Technology Holdings LLC

Stephen C. Moore
US9645663B2
CLAIM 1
. A display system for an electronic device (personal communications) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120306766A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action .

US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (personal communications) for the gestural hardware on how a multi-touch input will be processed .
US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (personal communications) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
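For orientation only, a minimal sketch of the toggleable status display panel recited in claim 12 above; the item names and rendering are assumed for illustration:

```python
# Hypothetical sketch of claim 12: a device-status panel the user can toggle
# between a visible mode and a hidden mode.

class StatusPanel:
    def __init__(self, items):
        self.items = items          # e.g., battery level, signal strength
        self.visible = True

    def toggle(self):
        """Switch the panel between visible and hidden modes."""
        self.visible = not self.visible
        return self.visible

    def render(self):
        return " | ".join(self.items) if self.visible else ""

panel = StatusPanel(["battery 80%", "wifi strong", "12:00"])
print(panel.render())          # visible mode
panel.toggle()
print(repr(panel.render()))    # hidden mode -> empty string
```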
US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 13
. The electronic device (personal communications) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 14
. An electronic device (personal communications) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120306766A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action .

US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 15
. The electronic device (personal communications) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
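For orientation only, a minimal sketch of the bezel-definition steps recited in claim 16 above; the data structure, contact points, and instruction string are assumed for illustration:

```python
# Hypothetical sketch of claim 16: the region where the holding hand touches
# the screen is registered as the virtual bezel, and the user's chosen
# response type is stored as personalized behavior for that region.

device_memory = {}  # stands in for non-transitory device memory

def register_virtual_bezel(contact_points):
    """Register the bounding box of the detected grip contact as the bezel."""
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    device_memory["virtual_bezel"] = (min(xs), min(ys), max(xs), max(ys))

def register_user_instruction(response_type):
    """Store the user's instruction on how bezel touches should respond."""
    device_memory["bezel_response"] = response_type

def interpret_bezel_touch(x, y):
    x0, y0, x1, y1 = device_memory["virtual_bezel"]
    if x0 <= x <= x1 and y0 <= y <= y1:
        return f"intentional input -> {device_memory['bezel_response']}"
    return "outside registered bezel"

register_virtual_bezel([(5, 400), (12, 520), (8, 610)])  # grip contacts
register_user_instruction("scroll active content")       # user's choice
print(interpret_bezel_touch(9, 500))
```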
US20120306766A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action .

US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
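For orientation only, a minimal sketch of the frequency-based holding-pattern logic recited in claim 17 above; the polygon names, hit threshold, and storage layout are assumed for illustration:

```python
# Hypothetical sketch of claim 17: unintentional grip touches define a
# polygonal area whose access frequency is tracked to derive a personalized
# holding pattern, which then defines the virtual bezel region.

from collections import defaultdict

device_memory = {"polygons": {}, "hits": defaultdict(int)}

def register_polygon(name, vertices):
    device_memory["polygons"][name] = vertices

def record_access(name):
    device_memory["hits"][name] += 1

def define_holding_pattern(min_hits=3):
    """Promote frequently touched polygons to the personalized holding
    pattern that defines the virtual bezel region."""
    pattern = [n for n in device_memory["polygons"]
               if device_memory["hits"][n] >= min_hits]
    device_memory["holding_pattern"] = pattern
    return pattern

register_polygon("left_edge", [(0, 300), (30, 300), (30, 700), (0, 700)])
for _ in range(4):
    record_access("left_edge")     # repeated unintentional grip contact
print(define_holding_pattern())    # -> ['left_edge'] becomes the bezel
```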
US20120306766A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action .

US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (personal communications) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
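For orientation only, a minimal sketch of the heat-signature step recited in claim 18 above; the thermal-grid layout, cell size, and temperature threshold are assumed for illustration:

```python
# Hypothetical sketch of claim 18: a coarse thermal-sensor grid yields a heat
# signature; cells above a warmth threshold define the polygonal grip area
# whose bounding vertices seed the virtual bezel region.

HEAT_THRESHOLD = 30.0  # assumed temperature threshold in degrees C

def heat_signature_polygon(thermal_grid, cell_size=40):
    """Return bounding-box vertices of warm cells (the gripping hand)."""
    warm = [(c * cell_size, r * cell_size)
            for r, row in enumerate(thermal_grid)
            for c, temp in enumerate(row) if temp >= HEAT_THRESHOLD]
    if not warm:
        return []
    xs, ys = zip(*warm)
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

grid = [
    [24.0, 24.5, 25.0],
    [33.0, 31.5, 25.0],   # warm cells along the left edge: the user's hand
    [34.0, 32.0, 25.5],
]
print(heat_signature_polygon(grid))  # vertices of the candidate bezel area
```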
US20120306766A1
CLAIM 1
. On a personal electronic device with a touch-sensitive screen , a method for responding to user input (user input) , the method comprising : receiving a touch on the touch-sensitive screen ;
associating a pressure with the touch ;
comparing the associated pressure with a first non-zero threshold ;
and if the associated pressure is less than the first threshold , then performing a first user-interface action , else performing a second user-interface action distinct from the first user-interface action .

US20120306766A1
CLAIM 2
. The method of claim 1 wherein the personal electronic device is selected from the group consisting of : mobile telephone , personal communications (electronic device) device , personal computer , tablet computer , kiosk , digital sign , and gaming console .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120254747A1

Filed: 2011-03-30     Issued: 2012-10-04

Methods, apparatuses and computer program products for generating regions of interest using gestures via a user interface

(Original Assignee) McKesson Financial Holdings ULC     (Current Assignee) Change Healthcare Holdings LLC

Radu Catalin Bocirnea
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (more regions) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (generate one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120254747A1
CLAIM 1
. A method comprising : receiving an indication of one or more touches at a corresponding one or more locations on a touch enabled display , respective touches defining a touch point at the corresponding location ;
generating , via a processor , one or more regions (first portion) of interest associated with one or more areas of at least one medical image in response to receipt of the one or more indications , a location of respective regions of interest corresponding to the location of respective one or more touch points ;
defining a diameter of respective regions of interest based in part on a width or an amount of pressure of the corresponding touch points ;
and defining each of the regions of interest to comprise at least one disc comprising one or more contours , wherein respective regions of interest correspond to an area for annotating the medical image .
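For orientation only, a minimal sketch of the region-of-interest generation recited in claim 1 of US20120254747A1 above; the scaling rule and input tuple layout are assumed for illustration:

```python
# Hypothetical sketch of US20120254747A1 claim 1: each touch point yields a
# region of interest whose diameter scales with touch width or pressure,
# for annotating a medical image.

def generate_rois(touch_points, scale=2.0):
    """touch_points: list of (x, y, width, pressure). Returns ROI discs as
    center/diameter records usable as annotation regions."""
    rois = []
    for x, y, width, pressure in touch_points:
        diameter = scale * max(width, pressure * 10)  # assumed scaling rule
        rois.append({"center": (x, y), "diameter": diameter})
    return rois

print(generate_rois([(120, 80, 12.0, 0.4), (300, 220, 8.0, 0.9)]))
```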

US20120254747A1
CLAIM 10
. An apparatus comprising : at least one memory ;
and at least one processor configured to cause the apparatus to : receive an indication of one or more touches at a corresponding one or more locations on a touch enabled display , respective touches defining a touch point at the corresponding location ;
generate one (second set) or more regions of interest associated with one or more areas of at least one medical image in response to receipt of the one or more indications , a location of respective regions of interest corresponding to the location of respective one or more touch points ;
define a diameter of respective regions of interest based in part on a width or an amount of pressure of the corresponding touch points ;
and define each of the regions of interest to comprise at least one disc comprising one or more contours , wherein respective regions of interest correspond to an area for annotating the medical image .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (more regions) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120254747A1
CLAIM 1
. A method comprising : receiving an indication of one or more touches at a corresponding one or more locations on a touch enabled display , respective touches defining a touch point at the corresponding location ;
generating , via a processor , one or more regions (first portion) of interest associated with one or more areas of at least one medical image in response to receipt of the one or more indications , a location of respective regions of interest corresponding to the location of respective one or more touch points ;
defining a diameter of respective regions of interest based in part on a width or an amount of pressure of the corresponding touch points ;
and defining each of the regions of interest to comprise at least one disc comprising one or more contours , wherein respective regions of interest correspond to an area for annotating the medical image .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first region) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120254747A1
CLAIM 8
. The method of claim 5 , wherein splitting the at least one region of interest into two disjoined regions comprises generating a first region (holding pattern) of interest associated with a first disjoined region of the two disjoined regions and a second region of interest associated with a second disjoined region of the two disjoined regions and the method further comprises : determining that the data associated with the two disjoined regions are no longer united .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (first region) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120254747A1
CLAIM 8
. The method of claim 5 , wherein splitting the at least one region of interest into two disjoined regions comprises generating a first region (holding pattern) of interest associated with a first disjoined region of the two disjoined regions and a second region of interest associated with a second disjoined region of the two disjoined regions and the method further comprises : determining that the data associated with the two disjoined regions are no longer united .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
WO2011126893A2

Filed: 2011-03-30     Issued: 2011-10-13

Apparatuses enabling concurrent communication between an interface die and a plurality of dice stacks, interleaved conductive paths in stacked devices, and methods for forming and operating the same

(Original Assignee) Micron Technology, Inc.     

Brent Keeth, Christopher K. Morzano
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (command signal) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device status display panel (command signal) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (command signal) and the pre-defined set of touch-based soft buttons are in a hidden mode .
WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (command signal) intended to affect the display of the first portion of the content on the active touchscreen region .
WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (command signal) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (command signal) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature (third position) from a user's hand (command signal) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (command signal) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
WO2011126893A2
CLAIM 39
. The method of claim 38 , further comprising : positioning a third dice stack at a third position (heat signature) on the interface die ;
and positioning a fourth dice stack at a fourth position on the interface die such that the interface die can be configured to enable concurrent communications with the first dice stack , the second dice stack , the third dice stack , and the fourth dice stack .

WO2011126893A2
CLAIM 49
. The method of claim 46 , wherein the first set and the second set of communications comprise address signals and/or command signal (electronic device status display panel, user input, user input area, s hand) s .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
JP2012128825A

Filed: 2011-03-23     Issued: 2012-07-05

Electronic device, display control method, and program (電子機器、表示制御方法、およびプログラム)

(Original Assignee) Sharp Corp; シャープ株式会社     

Masaya Azuma, 真哉 東
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (タッチ) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
JP2012128825A
CLAIM 3
The electronic device according to claim 2 , wherein the device is a touch (タッチ) (display screen, screen mode) pad for designating a position of a pointer displayed on the display , and the processor , when the position of the pointer is designated , causes the image at the position of the pointer and the image surrounding the position of the pointer to be displayed in said three-dimensional display .
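For orientation only, a minimal sketch of the pointer-driven three-dimensional display recited in JP2012128825A claim 3 above; the region radius and return format are assumed for illustration:

```python
# Hypothetical sketch of JP2012128825A claim 3: when a pointer position is
# designated via the touch pad, the image at and around that position is
# flagged for three-dimensional rendering.

def render_frame(pointer, radius=50):
    """Return which screen region is flagged for 3D rendering."""
    if pointer is None:
        return {"mode_3d_region": None}
    x, y = pointer
    return {"mode_3d_region": (x - radius, y - radius, x + radius, y + radius)}

print(render_frame(None))        # no pointer designated -> flat rendering
print(render_frame((200, 340)))  # region around the pointer rendered in 3D
```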

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (タッチ) .
JP2012128825A
CLAIM 3
The electronic device according to claim 2 , wherein the device is a touch (タッチ) (display screen, screen mode) pad for designating a position of a pointer displayed on the display , and the processor , when the position of the pointer is designated , causes the image at the position of the pointer and the image surrounding the position of the pointer to be displayed in said three-dimensional display .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (タッチ) .
JP2012128825A
CLAIM 3
The electronic device according to claim 2 , wherein the device is a touch (タッチ) (display screen, screen mode) pad for designating a position of a pointer displayed on the display , and the processor , when the position of the pointer is designated , causes the image at the position of the pointer and the image surrounding the position of the pointer to be displayed in said three-dimensional display .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (タッチ) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
JP2012128825A
CLAIM 3
The electronic device according to claim 2 , wherein the device is a touch (タッチ) (display screen, screen mode) pad for designating a position of a pointer displayed on the display , and the processor , when the position of the pointer is designated , causes the image at the position of the pointer and the image surrounding the position of the pointer to be displayed in said three-dimensional display .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (タッチ) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
JP2012128825A
CLAIM 3
The electronic device according to claim 2 , wherein the device is a touch (タッチ) (display screen, screen mode) pad for designating a position of a pointer displayed on the display , and the processor , when the position of the pointer is designated , causes the image at the position of the pointer and the image surrounding the position of the pointer to be displayed in said three-dimensional display .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (タッチ) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
JP2012128825A
CLAIM 3
The electronic device according to claim 2 , wherein the device is a touch (タッチ) (display screen, screen mode) pad for designating a position of a pointer displayed on the display , and the processor , when the position of the pointer is designated , causes the image at the position of the pointer and the image surrounding the position of the pointer to be displayed in said three-dimensional display .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (前記センサ) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
JP2012128825A
CLAIM 18
A display control method in an electronic device capable of three-dimensional display of an image , the electronic device comprising a processor , a memory storing image data representing the image , a display for displaying the image , and a sensor for detecting a pressing force , wherein the display is provided on a first surface of a housing of the electronic device and the sensor (前記センサ) (thermal sensors) is provided on a second surface of the housing , being the reverse side of the first surface , and detects the pressing force in the direction of the display , the display control method comprising : a step in which the sensor detects the pressing force ; and a step in which the processor , when the pressing force is detected , causes an image of at least a part of a display region of the display to be displayed three-dimensionally on the display in a manner popping out from the display .
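For orientation only, a minimal sketch of the back-face pressure trigger recited in JP2012128825A claim 18 above; the force threshold, units, and region format are assumed for illustration:

```python
# Hypothetical sketch of JP2012128825A claim 18: a pressure sensor on the
# back face of the housing triggers three-dimensional (pop-out) display of
# part of the screen when a press toward the display is detected.

PRESS_THRESHOLD = 1.0  # assumed force threshold in newtons

def display_control(back_pressure, region):
    """Return the display command produced for the given back-face pressure."""
    if back_pressure >= PRESS_THRESHOLD:
        return f"render region {region} in 3D, popping out of the display"
    return "render in normal 2D mode"

print(display_control(0.3, (0, 0, 400, 300)))   # no press detected
print(display_control(1.8, (0, 0, 400, 300)))   # press -> pop-out rendering
```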




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120240044A1

Filed: 2011-03-20     Issued: 2012-09-20

System and method for summoning user interface objects

(Original Assignee) Johnson William J; Johnson Jason M     

William J. Johnson, Jason M. Johnson
US9645663B2
CLAIM 1
. A display system (first location) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120240044A1
CLAIM 1
) A method in a data processing system for a user to summon a user interface object of a display of said data processing system , said method comprising : displaying a plurality of user interface objects to said display ;
recognizing a user input (user input) position of said display and a user search request for a user interface object of said user interface objects ;
and automatically moving said user interface object from a first position of said display to said input (second mode) position of said display in response to said request .
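For orientation only, a minimal sketch of the object-summoning behavior recited in claim 1 of US20120240044A1 above; the object names, positions, and matching rule are assumed for illustration:

```python
# Hypothetical sketch of US20120240044A1 claim 1: on a search request made at
# a user input position, the matching user interface object is moved from its
# first position to the input position.

objects = {"calculator": (40, 60), "notes": (500, 900)}  # object -> position

def summon(search_term, input_position):
    """Move the matching UI object to where the user touched."""
    for name, pos in objects.items():
        if search_term.lower() in name:
            objects[name] = input_position
            return f"moved '{name}' from {pos} to {input_position}"
    return "no matching user interface object"

print(summon("calc", (250, 480)))
print(objects)
```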

US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 2
. The display system (first location) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 3
. The display system (first location) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 4
. The display system (first location) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 5
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 6
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 7
. The display system (first location) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 8
. The display system (first location) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 9
. The display system (first location) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 10
. The display system (first location) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 11
. The display system (first location) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 12
. The display system (first location) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120240044A1
CLAIM 20
) A data processing system comprising : display means for displaying a plurality of user interface objects ;
processing means for accepting at a user input location of said display a user search request for one or more user interface objects ;
processing means for removing said one or more user interface objects from a first location (display system) of said display ;
and processing means for redisplaying at said input location information for said one or more user interface objects in response to said request .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120240044A1
CLAIM 1
) A method in a data processing system for a user to summon a user interface object of a display of said data processing system , said method comprising : displaying a plurality of user interface objects to said display ;
recognizing a user input (user input) position of said display and a user search request for a user interface object of said user interface objects ;
and automatically moving said user interface object from a first position of said display to said input (second mode) position of said display in response to said request .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode (said input) of response in the virtual bezel region .
US20120240044A1
CLAIM 1
) A method in a data processing system for a user to summon a user interface object of a display of said data processing system , said method comprising : displaying a plurality of user interface objects to said display ;
recognizing a user input position of said display and a user search request for a user interface object of said user interface objects ;
and automatically moving said user interface object from a first position of said display to said input (second mode) position of said display in response to said request .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120240044A1
CLAIM 1
) A method in a data processing system for a user to summon a user interface object of a display of said data processing system , said method comprising : displaying a plurality of user interface objects to said display ;
recognizing a user input (user input) position of said display and a user search request for a user interface object of said user interface objects ;
and automatically moving said user interface object from a first position of said display to said input position of said display in response to said request .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices (said object) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120240044A1
CLAIM 1
) A method in a data processing system for a user to summon a user interface object of a display of said data processing system , said method comprising : displaying a plurality of user interface objects to said display ;
recognizing a user input (user input) position of said display and a user search request for a user interface object of said user interface objects ;
and automatically moving said user interface object from a first position of said display to said input position of said display in response to said request .

US20120240044A1
CLAIM 6
) The method of claim 1 wherein said automatically moving said user interface object from a first position of said display to said input position of said display in response to said request comprises transitioning said object (area comprising vertices) for visual or audio animation .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices (said object) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120240044A1
CLAIM 1
) A method in a data processing system for a user to summon a user interface object of a display of said data processing system , said method comprising : displaying a plurality of user interface objects to said display ;
recognizing a user input (user input) position of said display and a user search request for a user interface object of said user interface objects ;
and automatically moving said user interface object from a first position of said display to said input position of said display in response to said request .

US20120240044A1
CLAIM 6
) The method of claim 1 wherein said automatically moving said user interface object from a first position of said display to said input position of said display in response to said request comprises transitioning said object (area comprising vertices) for visual or audio animation .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2500898A1

Filed: 2011-03-18     Issued: 2012-09-19

System and method for foldable display

(Original Assignee) Research in Motion Ltd     (Current Assignee) BlackBerry Ltd

W. Garland Phillips
US9645663B2
CLAIM 1
. A display system (display system) for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (computing system) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (generate one) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

EP2500898A1
CLAIM 7
A system according to claim 1 , wherein the fold signal indicative of the fold in the display unit is generated automatically or is generated in response to a user input (user input) .

EP2500898A1
CLAIM 14
A computer program product comprising computer program instructions recorded in non-transitory form on a machine-readable media , the instructions adapted to execute on a computing system (touchscreen layer, touchscreen display) to : monitor a signal indicative of a fold in a display unit ;
and in response to the signal indicative of a fold in the display unit , generate one (second set) or more display control signals to change the display of visual information to at least one second display format , wherein the second display format compensates for impairment in the display of visual information in a first format that is introduced by folding the display unit .
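For orientation only, a minimal sketch of the fold-signal handling recited in claims 1, 7, and 14 of EP2500898A1 above; the format names and compensation rule are assumed for illustration:

```python
# Hypothetical sketch of EP2500898A1: a fold signal (generated automatically
# or in response to a user input) causes the display control to switch the
# visual information to a second format that compensates for the fold.

def display_control_signals(fold_signal, content):
    """Return the format used to present the content given the fold state."""
    if fold_signal:
        # second display format: avoid placing content across the fold line
        return {"format": "split-pane", "content": content}
    return {"format": "full-panel", "content": content}

print(display_control_signals(False, "map view"))  # unfolded -> first format
print(display_control_signals(True, "map view"))   # folded -> second format
```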

US9645663B2
CLAIM 2
. The display system (display system) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 3
. The display system (display system) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 4
. The display system (display system) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
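A minimal sketch of the origin-based attribution recited in claims 3 and 4: a touch that crosses the region boundary is processed by the region where it originated. The helper name is hypothetical.

    def classify_swipe(start_in_bezel: bool, end_in_bezel: bool) -> str:
        # end_in_bezel is accepted to show both endpoints are known, but origin alone decides:
        # a swipe starting in the active region stays an active-region input even if it
        # terminates in the bezel (claim 3), and vice versa (claim 4).
        return "virtual bezel input" if start_in_bezel else "active region input"

    print(classify_swipe(start_in_bezel=False, end_in_bezel=True))   # active region input
    print(classify_swipe(start_in_bezel=True, end_in_bezel=False))   # virtual bezel input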
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 5
. The display system (display system) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 6
. The display system (display system) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 7
. The display system (display system) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
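A minimal sketch of how claims 5 through 7 could be read together: a multi-touch originating simultaneously in both regions is resolved toward the bezel, toward the active region, or per the user's registered instruction. Policy names are hypothetical.

    def resolve_spanning_multitouch(policy: str) -> str:
        policies = {
            "favor_bezel": "process as virtual bezel multi-touch",    # claim 5
            "favor_active": "process as active region multi-touch",   # claim 6
        }
        # Claim 7: otherwise follow the instruction the user registered for multi-touch handling.
        return policies.get(policy, "process per user instruction: " + policy)

    for p in ("favor_bezel", "favor_active", "route_to_foreground_app"):
        print(p, "->", resolve_spanning_multitouch(p))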
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 8
. The display system (display system) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
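A minimal sketch of the claim 8 toggle: an operating-system status bar resident in the virtual bezel whose visibility is flipped by a predefined gesture to enter or leave full-screen mode. The gesture name is hypothetical.

    class StatusBar:
        def __init__(self):
            self.visible = True   # status bar resides in the virtual bezel region

        def on_bezel_gesture(self, gesture: str) -> None:
            if gesture == "two_finger_swipe_down":   # hypothetical predefined gesture
                self.visible = not self.visible      # toggles full-screen mode

    bar = StatusBar()
    bar.on_bezel_gesture("two_finger_swipe_down")
    print("status bar visible:", bar.visible)   # False -> full-screen mode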
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 9
. The display system (display system) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 10
. The display system (display system) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 11
. The display system (display system) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
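A minimal sketch covering claims 9 through 11: soft buttons resident in the virtual bezel that the user can reposition, toggle between visible and hidden, and add to. The data structures and button names are hypothetical.

    class BezelSoftButtons:
        def __init__(self):
            self.buttons = {"back": {"pos": (0, 100), "visible": True}}

        def reposition(self, name, pos):            # claim 9
            self.buttons[name]["pos"] = pos

        def toggle_visibility(self, name):          # claim 10
            self.buttons[name]["visible"] = not self.buttons[name]["visible"]

        def add(self, name, pos):                   # claim 11
            self.buttons[name] = {"pos": pos, "visible": True}

    panel = BezelSoftButtons()
    panel.reposition("back", (0, 400))
    panel.toggle_visibility("back")
    panel.add("screenshot", (0, 700))
    print(panel.buttons)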
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 12
. The display system (display system) according to claim 9 , wherein the display screen comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2500898A1
CLAIM 1
A display system (display system) , comprising : a foldable display unit to display visual information , the visual information based at least in part on display data ;
and a display control unit receiving the display data and at least one fold signal indicative of a fold in at least a portion of the display unit , and outputting one or more display control signals operative to cause the display unit to display the visual information , the display control signals being output at least in part in response to the at least one fold signal to adapt the one or more display control signals to change a format of the visual information .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (computing system) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (user input) intended to affect the display of the first portion of the content on the active touchscreen region .
EP2500898A1
CLAIM 7
A system according to claim 1 , wherein the fold signal indicative of the fold in the display unit is generated automatically or is generated in response to a user input (user input) .

EP2500898A1
CLAIM 14
A computer program product comprising computer program instructions recorded in non-transitory form on a machine-readable media , the instructions adapted to execute on a computing system (touchscreen layer, touchscreen display) to : monitor a signal indicative of a fold in a display unit ;
and in response to the signal indicative of a fold in the display unit , generate one or more display control signals to change the display of visual information to at least one second display format , wherein the second display format compensates for impairment in the display of visual information in a first format that is introduced by folding the display unit .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
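A minimal sketch of the claim 16 method: register the detected grip region as the virtual bezel and store the user's chosen response type as personalized behavior for that region. The memory layout and response label are hypothetical.

    def define_virtual_bezel(grip_contact_points, user_response_choice, memory):
        # Register the detected grip region as the virtual bezel region.
        memory["virtual_bezel_region"] = list(grip_contact_points)
        # Register the user's response instruction as personalized behavior for that region.
        memory["bezel_behavior"] = user_response_choice
        return memory

    memory = {}
    define_virtual_bezel([(0, 300), (0, 900)], "scroll_active_content", memory)
    print(memory)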
EP2500898A1
CLAIM 7
A system according to claim 1 , wherein the fold signal indicative of the fold in the display unit is generated automatically or is generated in response to a user input (user input) .

EP2500898A1
CLAIM 14
A computer program product comprising computer program instructions recorded in non-transitory form on a machine-readable media , the instructions adapted to execute on a computing system (touchscreen layer, touchscreen display) to : monitor a signal indicative of a fold in a display unit ;
and in response to the signal indicative of a fold in the display unit , generate one or more display control signals to change the display of visual information to at least one second display format , wherein the second display format compensates for impairment in the display of visual information in a first format that is introduced by folding the display unit .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (user input) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
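A minimal sketch of the claim 17 method: accumulate the polygonal areas produced by unintentional holding contact, track how often each is touched, and keep the frequently touched polygons as the personalized holding pattern that defines the virtual bezel. The frequency threshold is an assumption.

    from collections import Counter

    def update_holding_pattern(memory, polygon_vertices):
        key = tuple(polygon_vertices)
        memory.setdefault("touch_counts", Counter())[key] += 1      # register area and count accesses
        threshold = 5                                               # hypothetical frequency cutoff
        memory["holding_pattern"] = [list(poly) for poly, n in
                                     memory["touch_counts"].items() if n >= threshold]
        memory["virtual_bezel_region"] = memory["holding_pattern"]
        return memory

    memory = {}
    for _ in range(6):
        update_holding_pattern(memory, [(0, 200), (40, 200), (40, 600), (0, 600)])
    print(memory["virtual_bezel_region"])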
EP2500898A1
CLAIM 7
A system according to claim 1 , wherein the fold signal indicative of the fold in the display unit is generated automatically or is generated in response to a user input (user input) .

EP2500898A1
CLAIM 14
A computer program product comprising computer program instructions recorded in non-transitory form on a machine-readable media , the instructions adapted to execute on a computing system (touchscreen layer, touchscreen display) to : monitor a signal indicative of a fold in a display unit ;
and in response to the signal indicative of a fold in the display unit , generate one or more display control signals to change the display of visual information to at least one second display format , wherein the second display format compensates for impairment in the display of visual information in a first format that is introduced by folding the display unit .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (computing system) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
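A minimal sketch of the claim 18 method: reduce a thermal-sensor heat signature to a polygonal area and register it as the personalized holding pattern defining the virtual bezel. The sensor reading format and temperature threshold are assumptions.

    def bezel_from_heat_signature(memory, thermal_readings, hot_threshold=31.0):
        # Keep only the sensor cells warmed by the holding hand and use them as the
        # vertices of the registered polygonal area.
        hot_cells = [(x, y) for (x, y, temp_c) in thermal_readings if temp_c >= hot_threshold]
        memory["polygonal_area"] = hot_cells
        memory["access_count"] = memory.get("access_count", 0) + 1   # detected usage frequency
        memory["virtual_bezel_region"] = hot_cells                   # personalized holding pattern
        return memory

    memory = {}
    bezel_from_heat_signature(memory, [(0, 200, 33.5), (0, 240, 32.8), (500, 900, 24.0)])
    print(memory["virtual_bezel_region"])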
EP2500898A1
CLAIM 7
A system according to claim 1 , wherein the fold signal indicative of the fold in the display unit is generated automatically or is generated in response to a user input (user input) .

EP2500898A1
CLAIM 14
A computer program product comprising computer program instructions recorded in non-transitory form on a machine-readable media , the instructions adapted to execute on a computing system (touchscreen layer, touchscreen display) to : monitor a signal indicative of a fold in a display unit ;
and in response to the signal indicative of a fold in the display unit , generate one or more display control signals to change the display of visual information to at least one second display format , wherein the second display format compensates for impairment in the display of visual information in a first format that is introduced by folding the display unit .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20130021295A1

Filed: 2011-03-16     Issued: 2013-01-24

Display device with touch panel function

(Original Assignee) Sharp Corp     (Current Assignee) Sharp Corp

Tomohiro Kimura, Keiichi Fukuyama, Yasuyuki Ogawa
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (crystal display device, projection area, display screen) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (inner side) with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20130021295A1
CLAIM 1
. A display device with a touch panel function , comprising : a first sensor for detecting presence/absence of pressurization on an input screen ;
a second sensor for detecting a contact position on said input (second mode) screen , whose power consumption for waiting in a detectable state is higher than power consumption of said first sensor ;
and a control unit for switching said second sensor into the detectable state when presence of pressurization is detected by said first sensor , wherein said input screen also serves as a display screen (screen mode, display screen) .
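A minimal sketch of the US20130021295A1 claim 1 control flow: a low-power first sensor detects pressurization and only then switches the higher-power position-detecting second sensor into its detectable state. Class and method names are hypothetical.

    class TwoStageTouchController:
        def __init__(self):
            self.second_sensor_enabled = False   # higher-power sensor starts idle

        def on_first_sensor(self, pressurized: bool) -> None:
            # Control unit: wake the second sensor only while pressurization is present.
            self.second_sensor_enabled = pressurized

        def read_position(self, raw_xy):
            # Contact position is only resolved while the second sensor is in its detectable state.
            return raw_xy if self.second_sensor_enabled else None

    ctrl = TwoStageTouchController()
    print(ctrl.read_position((120, 340)))   # None, second sensor still idle
    ctrl.on_first_sensor(True)
    print(ctrl.read_position((120, 340)))   # (120, 340)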

US20130021295A1
CLAIM 8
. The display device with a touch panel function according to claim 1 , wherein at least some of said first sensors are arranged in a projection area (screen mode, display screen) of said input screen .

US20130021295A1
CLAIM 10
. The display device with a touch panel function according to claim 8 , which is a liquid crystal display device (screen mode, display screen) having a black matrix portion for separating different pixels from each other by a member through which visible light is not passed , wherein said first sensors are arranged in said black matrix portion .

US20130021295A1
CLAIM 12
. The display device with a touch panel function according to claim 1 , which is a liquid crystal display device having a structure in which liquid crystal is sealed by a seal member , wherein said first sensor is arranged at a position adjacent to said seal member on an inner side (touchscreen layer) thereof .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (crystal display device, projection area, display screen) .
US20130021295A1
CLAIM 1
. A display device with a touch panel function , comprising : a first sensor for detecting presence/absence of pressurization on an input screen ;
a second sensor for detecting a contact position on said input screen , whose power consumption for waiting in a detectable state is higher than power consumption of said first sensor ;
and a control unit for switching said second sensor into the detectable state when presence of pressurization is detected by said first sensor , wherein said input screen also serves as a display screen (screen mode, display screen) .

US20130021295A1
CLAIM 8
. The display device with a touch panel function according to claim 1 , wherein at least some of said first sensors are arranged in a projection area (screen mode, display screen) of said input screen .

US20130021295A1
CLAIM 10
. The display device with a touch panel function according to claim 8 , which is a liquid crystal display device (screen mode, display screen) having a black matrix portion for separating different pixels from each other by a member through which visible light is not passed , wherein said first sensors are arranged in said black matrix portion .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (crystal display device, projection area, display screen) .
US20130021295A1
CLAIM 1
. A display device with a touch panel function , comprising : a first sensor for detecting presence/absence of pressurization on an input screen ;
a second sensor for detecting a contact position on said input screen , whose power consumption for waiting in a detectable state is higher than power consumption of said first sensor ;
and a control unit for switching said second sensor into the detectable state when presence of pressurization is detected by said first sensor , wherein said input screen also serves as a display screen (screen mode, display screen) .

US20130021295A1
CLAIM 8
. The display device with a touch panel function according to claim 1 , wherein at least some of said first sensors are arranged in a projection area (screen mode, display screen) of said input screen .

US20130021295A1
CLAIM 10
. The display device with a touch panel function according to claim 8 , which is a liquid crystal display device (screen mode, display screen) having a black matrix portion for separating different pixels from each other by a member through which visible light is not passed , wherein said first sensors are arranged in said black matrix portion .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (crystal display device, projection area, display screen) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20130021295A1
CLAIM 1
. A display device with a touch panel function , comprising : a first sensor for detecting presence/absence of pressurization on an input screen ;
a second sensor for detecting a contact position on said input screen , whose power consumption for waiting in a detectable state is higher than power consumption of said first sensor ;
and a control unit for switching said second sensor into the detectable state when presence of pressurization is detected by said first sensor , wherein said input screen also serves as a display screen (screen mode, display screen) .

US20130021295A1
CLAIM 8
. The display device with a touch panel function according to claim 1 , wherein at least some of said first sensors are arranged in a projection area (screen mode, display screen) of said input screen .

US20130021295A1
CLAIM 10
. The display device with a touch panel function according to claim 8 , which is a liquid crystal display device (screen mode, display screen) having a black matrix portion for separating different pixels from each other by a member through which visible light is not passed , wherein said first sensors are arranged in said black matrix portion .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (crystal display device, projection area, display screen) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (inner side) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20130021295A1
CLAIM 1
. A display device with a touch panel function , comprising : a first sensor for detecting presence/absence of pressurization on an input screen ;
a second sensor for detecting a contact position on said input (second mode) screen , whose power consumption for waiting in a detectable state is higher than power consumption of said first sensor ;
and a control unit for switching said second sensor into the detectable state when presence of pressurization is detected by said first sensor , wherein said input screen also serves as a display screen (screen mode, display screen) .

US20130021295A1
CLAIM 8
. The display device with a touch panel function according to claim 1 , wherein at least some of said first sensors are arranged in a projection area (screen mode, display screen) of said input screen .

US20130021295A1
CLAIM 10
. The display device with a touch panel function according to claim 8 , which is a liquid crystal display device (screen mode, display screen) having a black matrix portion for separating different pixels from each other by a member through which visible light is not passed , wherein said first sensors are arranged in said black matrix portion .

US20130021295A1
CLAIM 12
. The display device with a touch panel function according to claim 1 , which is a liquid crystal display device having a structure in which liquid crystal is sealed by a seal member , wherein said first sensor is arranged at a position adjacent to said seal member on an inner side (touchscreen layer) thereof .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (crystal display device, projection area, display screen) , the gestural software application configured to produce the second mode (said input) of response in the virtual bezel region .
US20130021295A1
CLAIM 1
. A display device with a touch panel function , comprising : a first sensor for detecting presence/absence of pressurization on an input screen ;
a second sensor for detecting a contact position on said input (second mode) screen , whose power consumption for waiting in a detectable state is higher than power consumption of said first sensor ;
and a control unit for switching said second sensor into the detectable state when presence of pressurization is detected by said first sensor , wherein said input screen also serves as a display screen (screen mode, display screen) .

US20130021295A1
CLAIM 8
. The display device with a touch panel function according to claim 1 , wherein at least some of said first sensors are arranged in a projection area (screen mode, display screen) of said input screen .

US20130021295A1
CLAIM 10
. The display device with a touch panel function according to claim 8 , which is a liquid crystal display device (screen mode, display screen) having a black matrix portion for separating different pixels from each other by a member through which visible light is not passed , wherein said first sensors are arranged in said black matrix portion .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US8373675B2

Filed: 2011-02-01     Issued: 2013-02-12

Display panel, display apparatus having the same, and method thereof

(Original Assignee) Samsung Display Co Ltd     (Current Assignee) Samsung Display Co Ltd

Jin Jeon, Hyung-Guel Kim
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (first direction) of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (sensing electrodes) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US8373675B2
CLAIM 4
. The display apparatus of claim 1 , wherein the array substrate further comprises a third sensing electrode which is spaced apart from the first and the second sensing electrodes (first portion) , and the connecting member overlapping with at least a portion of the third sensing electrode .

US8373675B2
CLAIM 9
. A display panel comprising : an array substrate including : a data line ;
a gate line crossing the data line ;
a first signal line extending in a first direction (first mode) ;
a pixel including a pixel electrode and a common electrode insulated from the pixel electrode ;
and a first sensing electrode which is electrically connected to the first signal line ;
an opposite substrate combined with the array substrate and including a connecting member overlapping with at least a portion of the first sensing electrode , and electrically connected to the at least a portion of the first sensing electrode by an externally provided pressure ;
and a liquid crystal layer interposed between the array substrate and the opposite substrate and including liquid crystal molecules , of which an alignment is controlled by an electric field formed between the common electrode and the pixel electrode .

US9645663B2
CLAIM 2
. The display system according to claim 1 , wherein the gestural software application is configured to produce the first mode (first direction) of response in the active touchscreen region .
US8373675B2
CLAIM 9
. A display panel comprising : an array substrate including : a data line ;
a gate line crossing the data line ;
a first signal line extending in a first direction (first mode) ;
a pixel including a pixel electrode and a common electrode insulated from the pixel electrode ;
and a first sensing electrode which is electrically connected to the first signal line ;
an opposite substrate combined with the array substrate and including a connecting member overlapping with at least a portion of the first sensing electrode , and electrically connected to the at least a portion of the first sensing electrode by an externally provided pressure ;
and a liquid crystal layer interposed between the array substrate and the opposite substrate and including liquid crystal molecules , of which an alignment is controlled by an electric field formed between the common electrode and the pixel electrode .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (first direction) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (sensing electrodes) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US8373675B2
CLAIM 4
. The display apparatus of claim 1 , wherein the array substrate further comprises a third sensing electrode which is spaced apart from the first and the second sensing electrodes (first portion) , and the connecting member overlapping with at least a portion of the third sensing electrode .

US8373675B2
CLAIM 9
. A display panel comprising : an array substrate including : a data line ;
a gate line crossing the data line ;
a first signal line extending in a first direction (first mode) ;
a pixel including a pixel electrode and a common electrode insulated from the pixel electrode ;
and a first sensing electrode which is electrically connected to the first signal line ;
an opposite substrate combined with the array substrate and including a connecting member overlapping with at least a portion of the first sensing electrode , and electrically connected to the at least a portion of the first sensing electrode by an externally provided pressure ;
and a liquid crystal layer interposed between the array substrate and the opposite substrate and including liquid crystal molecules , of which an alignment is controlled by an electric field formed between the common electrode and the pixel electrode .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (electric field) , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US8373675B2
CLAIM 1
. A display apparatus comprising : a display panel comprising : an array substrate including a data line , a gate line crossing the data line , a first signal line substantially parallel to the gate line , a second signal line substantially parallel to the data line , a pixel including a pixel electrode and a common electrode insulated from the pixel electrode , a first sensing electrode electrically connected to the first signal line , and a second sensing electrode electrically connected to the second signal line ;
an opposite substrate combined with the array substrate to receive a liquid crystal layer and including a connecting member overlapping with at least a portion of the first sensing electrode and at least a portion of the second sensing electrode , the connecting member being connected to the at least a portion of the first sensing electrode and the at least a portion of the sensing electrode by an externally provided pressure ;
and a liquid crystal layer interposed between the array substrate and the opposite substrate and including liquid crystal molecules , of which an alignment is controlled by an electric field (touchscreen area) formed between the common electrode and the pixel electrode ;
a touch position detecting part detecting the first and the second signal lines electrically connected to the connecting member to output a detection signal ;
and a position determining part determining position coordinates of the externally provided pressure based on the detection signal .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120162087A1

Filed: 2010-12-22     Issued: 2012-06-28

Cover glass button for display of mobile device

(Original Assignee) Universal Cement Corp     (Current Assignee) Universal Cement Corp

Chih Sheng Hou
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (touch panel) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120162087A1
CLAIM 2
. A cover glass button for a mobile device as claimed in claim 1 , further comprising : a touch panel (display screen, screen mode) , configured on bottom of said cover glass ;
an LCD module , configured on bottom of said touch panel ;
and an electronics compartment , configured on bottom of said LCD module .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch panel) .
US20120162087A1
CLAIM 2
. A cover glass button for a mobile device as claimed in claim 1 , further comprising : a touch panel (display screen, screen mode) , configured on bottom of said cover glass ;
an LCD module , configured on bottom of said touch panel ;
and an electronics compartment , configured on bottom of said LCD module .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch panel) .
US20120162087A1
CLAIM 2
. A cover glass button for a mobile device as claimed in claim 1 , further comprising : a touch panel (display screen, screen mode) , configured on bottom of said cover glass ;
an LCD module , configured on bottom of said touch panel ;
and an electronics compartment , configured on bottom of said LCD module .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch panel) comprises an electronic device status display panel displaying at least one information item from a set of information items (said platform) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120162087A1
CLAIM 2
. A cover glass button for a mobile device as claimed in claim 1 , further comprising : a touch panel (display screen, screen mode) , configured on bottom of said cover glass ;
an LCD module , configured on bottom of said touch panel ;
and an electronics compartment , configured on bottom of said LCD module .

US20120162087A1
CLAIM 5
. A cover glass button for a mobile device as claimed in claim 4 , wherein said platform (information items) is composed of a left platform , a right platform , a top platform , and a bottom platform .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (touch panel) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120162087A1
CLAIM 2
. A cover glass button for a mobile device as claimed in claim 1 , further comprising : a touch panel (display screen, screen mode) , configured on bottom of said cover glass ;
an LCD module , configured on bottom of said touch panel ;
and an electronics compartment , configured on bottom of said LCD module .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch panel) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120162087A1
CLAIM 2
. A cover glass button for a mobile device as claimed in claim 1 , further comprising : a touch panel (display screen, screen mode) , configured on bottom of said cover glass ;
an LCD module , configured on bottom of said touch panel ;
and an electronics compartment , configured on bottom of said LCD module .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120032876A1

Filed: 2010-10-24     Issued: 2012-02-09

Mega communication and media apparatus configured to provide faster data transmission speed and to generate electrical energy

(Original Assignee) Joseph Akwo Tabe     

Joseph Akwo Tabe
US9645663B2
CLAIM 1
. A display system (rate signals) for an electronic device (electronic device) comprising : a touch-sensitive display screen (electronic wafer) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (telecommunications system) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (textile fibers, common node) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (structured data, remote device) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (discharge cycles, output port, user input) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120032876A1
CLAIM 16
. A communication device of claim 1 , wherein said sensor apparatus comprises at least one of : a display device ;
an input/out device ;
digital video broadcast device ;
an entertainment device ;
a digital audio broadcast device ;
digital multimedia broadcast device ;
a global positioning system ;
safety services ;
a transportation road communication systems ;
CMOS multiples on chip device ;
a universal mobile telecommunications system (first set) ;
a touch screen input/output device operable for interactive communications .

US20120032876A1
CLAIM 24
. A communication device of claim 22 , wherein said at least one source further include at least one of : carbon char ;
carbon black ;
metal sulfides ;
metal oxides ;
organic materials ;
textile fibers (first portion) ;
zinc oxide (ZnO) ;
nano-wires ;
piezoelectric crystals ;
a sensory layer ;
wet etching ;
dry etching ;
electron-silicon substrate-oxide ;
metal oxide semiconductor ;
optical properties ;
glass fiber ;
substrate micro fiber ;
cell platform ;
solar cell ;
meta-material ;
wherein said at least one source is alloyed with silicon substrate microfiber material .

US20120032876A1
CLAIM 25
. A communication device of claim 1 , wherein said power management module further comprises a cell platform comprising at least one of nickel-cadmium batteries (NiCd) ;
nickel oxide hydroxide ;
metallic cadmium ;
wafer module ;
capacitor ;
complementary metal oxide semiconductor ;
wherein said capacitor operatively configured to withstand higher number of charge/discharge cycles (holding pattern, user input, s thermal sensors) and faster charge and discharge rates .

US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 51
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one on web based application ;
a photovoltaic device ;
a support system ;
a content index ;
a computer apparatus ;
a game device , a television device ;
an entertainment device ;
a protocol for delivering structured data (second set) ;
an advertisement platform ;
cache memory for caching structural functions ;
a mapping circuit indexed by at least a pattern ID value ;
a social network ;
Internet protocol television communication system ;
a video ;
a chat platform ;
virtual private network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US20120032876A1
CLAIM 65
. A communication device of claim 64 , wherein said input/output device in communication with at least a remote device (second set) .

US20120032876A1
CLAIM 74
. A communication device of claim 1 , wherein said communication apparatus further comprising at least one of : at least an input/output device configured with at least an output port (holding pattern, user input, s thermal sensors) ;
means for generating downlink signals ;
means for transmitting downlink signals ;
means for transmitting at least a downlink signals with overlapping frequencies ;
means for generating downlink signals of different communication contents ;
at least a transceiver having at least an uplink receive port to receive at least an uplink signal ;
said at least one antenna apparatus configured with said communication apparatus for providing at least a non-overlapping coverage area .

US20120032876A1
CLAIM 76
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : means for maximizing at least a boundary coverage area ;
means for minimizing at least a boundary coverage area ;
means for accepting at least user input (holding pattern, user input, s thermal sensors) ;
at least a reconfiguration switch apparatus ;
at least a hard-wired-signal splitter means ;
at least a time division duplexed at said at least one antenna apparatus ;
at least a frequency division duplexed at said at least one antenna apparatus ;
means for minimizing boundaries between coverage areas ;
at least a duplexer apparatus ;
means for dividing communication signals into plurality communication channels ;
means for increasing cell coverage ;
means for lowering output power ;
at least a multiple input-multiple output apparatus .

US20120032876A1
CLAIM 85
. A communication device of claim 1 , wherein said antenna apparatus further comprising at least one of : means for associating radiating RF fields with said means for feeding electromagnetic signals ;
at least an antenna having at least a grounding portion operable for radiating RF fields between the silicon substrate and the shorted end portion ;
providing the shorted end portion with at least a common node (first portion) ;
shielding the means for radiating RF fields to increase conductive compensation effect ;
shielding the means for radiating RF fields to increase capacitive compensation effects ;
apparatus for radiating electromagnetic signals ;
providing at least a feeding portion for feeding electromagnetic signals ;
at least an integrated rectifier ;
at least means for converting infrared/THz electromagnetic radiation into DC power ;
means for rectifying at least an induced voltage to at least terahertz frequency ;
at least a receiving nano-antenna in association with a rectifying circuit ;
at least a broadband rectifying antennas ;
at least a CMOS active circulator ;
at least on-chip-inductance comprising a frequency dependent apparatus ;
at least a reflector comprising at least a meta-material resonant cavity ;
at least a silicon CMOS comprising at least one of : an FPGA layer , a chip , operable for wireless data transmission .

US9645663B2
CLAIM 2
. The display system (rate signals) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 3
. The display system (rate signals) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 4
. The display system (rate signals) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 5
. The display system (rate signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (electronic wafer) .
US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 6
. The display system (rate signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (electronic wafer) .
US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 7
. The display system (rate signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .
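
Illustrative sketch (assumed names; not a description of the patentee's implementation): claims 5, 6, and 7 recite three alternative policies for a multi-touch input that begins simultaneously in both regions: treat it as a bezel input, treat it as an active-region input, or follow an instruction registered by the user. A simple policy resolver capturing those alternatives could be sketched as follows.

from enum import Enum
from typing import List

class StraddlePolicy(Enum):
    BEZEL_WINS = "virtual_bezel"     # claim 5: treat as bezel multi-touch
    ACTIVE_WINS = "active_region"    # claim 6: treat as active-region multi-touch
    USER_DEFINED = "user_defined"    # claim 7: follow a user-registered instruction


def resolve_multitouch(origins: List[str],
                       policy: StraddlePolicy,
                       user_choice: str = "active_region") -> str:
    """Decide how a simultaneous multi-touch that begins in both regions is
    processed. 'origins' holds the region of each contact's first point."""
    straddles = "virtual_bezel" in origins and "active_region" in origins
    if not straddles:
        return origins[0]                      # ordinary single-region gesture
    if policy is StraddlePolicy.USER_DEFINED:
        return user_choice                     # instruction registered by the user
    return policy.value


if __name__ == "__main__":
    touches = ["virtual_bezel", "active_region"]
    print(resolve_multitouch(touches, StraddlePolicy.BEZEL_WINS))    # -> virtual_bezel
    print(resolve_multitouch(touches, StraddlePolicy.USER_DEFINED,
                             user_choice="virtual_bezel"))           # -> virtual_bezel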

US9645663B2
CLAIM 8
. The display system (rate signals) according to claim 1 , wherein an operating system status bar (vertical movement) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120032876A1
CLAIM 19
. A Communication device of claim 18 , wherein said object further comprises at least a near solid object including a human finger ;
said movement further comprises at least one of : vertical movement (operating system status bar) ;
horizontal movement ;
diagonal movement .

US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .
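
Illustrative sketch (the gesture names are assumptions, not taken from the specification): claim 8 places the operating system status bar in the virtual bezel and lets a predefined gesture set toggle its visibility together with a full-screen mode. A minimal state machine for that behavior might be:

# Predefined toggle gestures are hypothetical placeholders.
PREDEFINED_TOGGLE_GESTURES = {"two_finger_swipe_down", "double_tap_bezel"}

class StatusBar:
    def __init__(self):
        self.visible = True
        self.full_screen = False

    def on_bezel_gesture(self, gesture: str) -> None:
        if gesture in PREDEFINED_TOGGLE_GESTURES:
            self.visible = not self.visible
            self.full_screen = not self.visible   # hiding the bar enters full-screen

bar = StatusBar()
bar.on_bezel_gesture("double_tap_bezel")
print(bar.visible, bar.full_screen)   # False True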

US9645663B2
CLAIM 9
. The display system (rate signals) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 10
. The display system (rate signals) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US9645663B2
CLAIM 11
. The display system (rate signals) according to claim 9 , wherein the user can add one (receiving port, other node) or more touch-based soft buttons within the virtual bezel region .
US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US20120032876A1
CLAIM 75
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : at least a determination device for determining at least a source-side wireless communication device ;
at least an authentication memory device ;
at least a sink-side wireless communication device ;
at least a multi-level shielding apparatus operable for attenuation of unwanted signals ;
at least an operational bandwidth comprising wave frequencies ;
at least a cellular radio transceiver ;
at least a graphic user interface comprising at least a human speech interface ;
at least a global positioning satellite receiver ;
means for routing at least an uplink signal from at least an unit device to at least a receiving port (add one) .

US20120032876A1
CLAIM 79
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of at least a telecommunication network ;
at least a telecommunication network operable for providing backhaul communication between at least a communication switching center and at least a base station ;
at least a network node at spaced apart sites ;
means for transporting information to other node (add one) s in at least said communication network ;
means for providing information exchange between at least said communication network and at least a network of users ;
at least a base station access point ;
at least a wireless communication access point ;
at least a wireless communication path comprising at least a millimeter wave link .
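
Illustrative sketch (hypothetical data structures): claims 9 through 11 recite a pre-defined set of touch-based soft buttons residing in the virtual bezel that the user can reposition, toggle between visible and hidden, and extend with additional buttons. One way to model that button set:

from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class SoftButton:
    label: str
    position: Tuple[int, int]      # coordinates inside the virtual bezel
    visible: bool = True

@dataclass
class BezelButtonSet:
    buttons: Dict[str, SoftButton] = field(default_factory=dict)

    def add(self, label: str, position: Tuple[int, int]) -> None:
        # claim 11: the user can add one or more soft buttons to the bezel
        self.buttons[label] = SoftButton(label, position)

    def move(self, label: str, new_position: Tuple[int, int]) -> None:
        # claim 9: the user can reposition a soft button within the bezel
        self.buttons[label].position = new_position

    def toggle(self, label: str) -> None:
        # claim 10: the user can toggle a soft button between visible and hidden
        self.buttons[label].visible = not self.buttons[label].visible

bezel_buttons = BezelButtonSet()
bezel_buttons.add("back", (10, 400))
bezel_buttons.move("back", (10, 800))
bezel_buttons.toggle("back")
print(bezel_buttons.buttons["back"])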

US9645663B2
CLAIM 12
. The display system (rate signals) according to claim 9 , wherein the display screen (electronic wafer) comprises an electronic device (electronic device) status display panel (said communication network, touch screen) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120032876A1
CLAIM 16
. A communication device of claim 1 , wherein said sensor apparatus comprises at least one of : a display device ;
an input/out device ;
digital video broadcast device ;
an entertainment device ;
a digital audio broadcast device ;
digital multimedia broadcast device ;
a global positioning system ;
safety services ;
a transportation road communication systems ;
CMOS multiples on chip device ;
a universal mobile telecommunications system ;
a touch screen (electronic device status display panel) input/output device operable for interactive communications .

US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 56
. A communication device of claim 1 , wherein said communications apparatus further comprising at least one of a transmitter ;
a receiver ;
each configured with at least one CMOS antenna apparatus on a chip operatively configured to separate signals (display system) normal to at least one of : an audio device ;
a cell phone device ;
an electronic data transmission device ;
energy harvesting .

US20120032876A1
CLAIM 79
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of at least a telecommunication network ;
at least a telecommunication network operable for providing backhaul communication between at least a communication switching center and at least a base station ;
at least a network node at spaced apart sites ;
means for transporting information to other nodes in at least said communication network (electronic device status display panel) ;
means for providing information exchange between at least said communication network and at least a network of users ;
at least a base station access point ;
at least a wireless communication access point ;
at least a wireless communication path comprising at least a millimeter wave link .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (said communication network, touch screen) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120032876A1
CLAIM 16
. A communication device of claim 1 , wherein said sensor apparatus comprises at least one of : a display device ;
an input/out device ;
digital video broadcast device ;
an entertainment device ;
a digital audio broadcast device ;
digital multimedia broadcast device ;
a global positioning system ;
safety services ;
a transportation road communication systems ;
CMOS multiples on chip device ;
a universal mobile telecommunications system ;
a touch screen (electronic device status display panel) input/output device operable for interactive communications .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 79
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of at least a telecommunication network ;
at least a telecommunication network operable for providing backhaul communication between at least a communication switching center and at least a base station ;
at least a network node at spaced apart sites ;
means for transporting information to other nodes in at least said communication network (electronic device status display panel) ;
means for providing information exchange between at least said communication network and at least a network of users ;
at least a base station access point ;
at least a wireless communication access point ;
at least a wireless communication path comprising at least a millimeter wave link .
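
Illustrative sketch (gesture names and class names are assumptions): claims 12 and 13 recite a device status display panel that can be toggled between visible and hidden, and a third set of touch-based inputs that lets the user navigate the device, in either region, while the status panel and the soft buttons are hidden. A compact model of that conditional behavior:

NAVIGATION_GESTURES = {"edge_swipe_up", "edge_swipe_left", "edge_swipe_right"}

class DeviceChrome:
    def __init__(self):
        self.status_panel_visible = True
        self.soft_buttons_visible = True

    def toggle_status_panel(self) -> None:
        self.status_panel_visible = not self.status_panel_visible

    def handle_gesture(self, gesture: str, region: str) -> str:
        hidden_chrome = (not self.status_panel_visible
                         and not self.soft_buttons_visible)
        if hidden_chrome and gesture in NAVIGATION_GESTURES:
            # third set of touch-based inputs: accepted in the active region
            # and in the virtual bezel region alike
            return f"navigate:{gesture}"
        return f"default:{region}:{gesture}"

chrome = DeviceChrome()
chrome.toggle_status_panel()
chrome.soft_buttons_visible = False
print(chrome.handle_gesture("edge_swipe_up", "virtual_bezel"))  # navigate:edge_swipe_up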

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen (electronic wafer) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (textile fibers, common node) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (discharge cycles, output port, user input) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120032876A1
CLAIM 24
. A communication device of claim 22 , wherein said at least one source further include at least one of : carbon char ;
carbon black ;
metal sulfides ;
metal oxides ;
organic materials ;
textile fibers (first portion) ;
zinc oxide (ZnO) ;
nano-wires ;
piezoelectric crystals ;
a sensory layer ;
wet etching ;
dry etching ;
electron-silicon substrate-oxide ;
metal oxide semiconductor ;
optical properties ;
glass fiber ;
substrate micro fiber ;
cell platform ;
solar cell ;
meta-material ;
wherein said at least one source is alloyed with silicon substrate microfiber material .

US20120032876A1
CLAIM 25
. A communication device of claim 1 , wherein said power management module further comprises a cell platform comprising at least one of nickel-cadmium batteries (NiCd) ;
nickel oxide hydroxide ;
metallic cadmium ;
wafer module ;
capacitor ;
complementary metal oxide semiconductor ;
wherein said capacitor operatively configured to withstand higher number of charge/discharge cycles (holding pattern, user input, s thermal sensors) and faster charge and discharge rates .

US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 74
. A communication device of claim 1 , wherein said communication apparatus further comprising at least one of : at least an input/output device configured with at least an output port (holding pattern, user input, s thermal sensors) ;
means for generating downlink signals ;
means for transmitting downlink signals ;
means for transmitting at least a downlink signals with overlapping frequencies ;
means for generating downlink signals of different communication contents ;
at least a transceiver having at least an uplink receive port to receive at least an uplink signal ;
said at least one antenna apparatus configured with said communication apparatus for providing at least a non-overlapping coverage area .

US20120032876A1
CLAIM 76
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : means for maximizing at least a boundary coverage area ;
means for minimizing at least a boundary coverage area ;
means for accepting at least user input (holding pattern, user input, s thermal sensors) ;
at least a reconfiguration switch apparatus ;
at least a hard-wired-signal splitter means ;
at least a time division duplexed at said at least one antenna apparatus ;
at least a frequency division duplexed at said at least one antenna apparatus ;
means for minimizing boundaries between coverage areas ;
at least a duplexer apparatus ;
means for dividing communication signals into plurality communication channels ;
means for increasing cell coverage ;
means for lowering output power ;
at least a multiple input-multiple output apparatus .

US20120032876A1
CLAIM 85
. A communication device of claim 1 , wherein said antenna apparatus further comprising at least one of : means for associating radiating RF fields with said means for feeding electromagnetic signals ;
at least an antenna having at least a grounding portion operable for radiating RF fields between the silicon substrate and the shorted end portion ;
providing the shorted end portion with at least a common node (first portion) ;
shielding the means for radiating RF fields to increase conductive compensation effect ;
shielding the means for radiating RF fields to increase capacitive compensation effects ;
apparatus for radiating electromagnetic signals ;
providing at least a feeding portion for feeding electromagnetic signals ;
at least an integrated rectifier ;
at least means for converting infrared/THz electromagnetic radiation into DC power ;
means for rectifying at least an induced voltage to at least terahertz frequency ;
at least a receiving nano-antenna in association with a rectifying circuit ;
at least a broadband rectifying antennas ;
at least a CMOS active circulator ;
at least on-chip-inductance comprising a frequency dependent apparatus ;
at least a reflector comprising at least a meta-material resonant cavity ;
at least a silicon CMOS comprising at least one of : an FPGA layer , a chip , operable for wireless data transmission .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (electronic wafer) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .
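
Illustrative sketch (the pressure threshold and all names are assumptions; the claims do not specify how "selective interpretation" is decided): claims 14 and 15, like independent claim 1, describe a display with an active touchscreen region operating under a first mode of response and a virtual bezel region operating under a second mode that selectively promotes a touch to intentional input affecting the content shown in the active region. A minimal two-region model under those assumptions:

from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class Region:
    name: str
    mode: Callable[[Tuple[float, float], float], str]

def first_mode(point: Tuple[float, float], pressure: float) -> str:
    return "direct_manipulation"            # every touch acts on the content

def second_mode(point: Tuple[float, float], pressure: float) -> str:
    # Selective interpretation: only a deliberate (here, firm) touch is treated
    # as intentional and forwarded to affect the active-region content.
    return "intentional_input" if pressure > 0.6 else "ignored_grip_touch"

active = Region("active_touchscreen_region", first_mode)
bezel = Region("virtual_bezel_region", second_mode)

print(active.mode((500, 500), 0.2))   # direct_manipulation
print(bezel.mode((10, 500), 0.2))     # ignored_grip_touch (resting finger)
print(bezel.mode((10, 500), 0.9))     # intentional_input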

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display (photovoltaic array, electrical power, digital camera, visual device, service provider) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (discharge cycles, output port, user input) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120032876A1
CLAIM 25
. A communication device of claim 1 , wherein said power management module further comprises a cell platform comprising at least one of nickel-cadmium batteries (NiCd) ;
nickel oxide hydroxide ;
metallic cadmium ;
wafer module ;
capacitor ;
complementary metal oxide semiconductor ;
wherein said capacitor operatively configured to withstand higher number of charge/discharge cycles (holding pattern, user input, s thermal sensors) and faster charge and discharge rates .

US20120032876A1
CLAIM 39
. A communication device of claim 1 , wherein said communication apparatus further comprises circuit board comprising electronic system's applications being configured for at least one of : a wired communications device ;
a wireless communications device ;
a cell phone ;
a handheld communication device ;
laptop computer ;
desktop computer ;
telemetry device ;
a switching device ;
MP3 player ;
a router ;
a repeater , a codec ;
a LAN ;
a WLAN ;
a Bluetooth enabled device ;
a digital camera (touchscreen area, user input area, touchscreen display) ;
a digital audio player and/or recorder ;
a digital video player and/or recorder ;
a computer ;
a monitor ;
a television set ;
a satellite set top box ;
a cable modem ;
a digital automotive control system ;
a control module ;
a communication module ;
a digitally-controlled home appliance ;
a printer ;
a copier ;
a digital audio or video receiver ;
an RF transceiver ;
a personal digital assistant (PDA) ;
a digital game playing device ;
a digital testing and/or measuring device ;
a digital avionics device ;
a media device ;
a medical device ;
a digitally-controlled medical equipment .

US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power (touchscreen area, user input area, touchscreen display) generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer module ;
photovoltaic array (touchscreen area, user input area, touchscreen display) ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 59
. A communication device of claim 1 , further comprises an audio device configured with at least an audio/visual device (touchscreen area, user input area, touchscreen display) operable for at least one of inputting communications ;
outputting communications ;
wherein said at least one audio/visual device further comprises at least one of : a touch screen input/output device ;
at least a speaker device ;
at least a microphone device ;
said speaker device ;
at least a voice enabled communications device , including Voice Over Internet Protocol “VOIP” .

US20120032876A1
CLAIM 74
. A communication device of claim 1 , wherein said communication apparatus further comprising at least one of : at least an input/output device configured with at least an output port (holding pattern, user input, s thermal sensors) ;
means for generating downlink signals ;
means for transmitting downlink signals ;
means for transmitting at least a downlink signals with overlapping frequencies ;
means for generating downlink signals of different communication contents ;
at least a transceiver having at least an uplink receive port to receive at least an uplink signal ;
said at least one antenna apparatus configured with said communication apparatus for providing at least a non-overlapping coverage area .

US20120032876A1
CLAIM 76
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : means for maximizing at least a boundary coverage area ;
means for minimizing at least a boundary coverage area ;
means for accepting at least user input (holding pattern, user input, s thermal sensors) ;
at least a reconfiguration switch apparatus ;
at least a hard-wired-signal splitter means ;
at least a time division duplexed at said at least one antenna apparatus ;
at least a frequency division duplexed at said at least one antenna apparatus ;
means for minimizing boundaries between coverage areas ;
at least a duplexer apparatus ;
means for dividing communication signals into plurality communication channels ;
means for increasing cell coverage ;
means for lowering output power ;
at least a multiple input-multiple output apparatus .

US20120032876A1
CLAIM 81
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : at least a cellular base station ;
at least a WiMax base station ;
at least an LTE base station ;
at least a mobile telephone switching center ;
at least a telecommunication service provider (touchscreen area, user input area, touchscreen display) ;
at least a wide area network hub ;
at least an Internet service provider ;
at least a public telecommunication network ;
at least a metropolitan network ;
at least a rural area network ;
at least a transportable communication network ;
at least a vehicular systems communication network ;
at least a computer system communication network ;
at least a network for providing at least a communication path from at least a base station to at least a communication switching center ;
at least a millimeter wave link communication path ;
at least a radio communication apparatus comprising at least a millimeter wave system ;
at least a millimeter wave antenna for producing at least millimeter wave beam .
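
Illustrative sketch (hypothetical helpers; the bounding-box step is an assumption, since claim 16 only requires detecting and registering the contacted region): the claim 16 method detects the region of the display in contact with the holding hand, registers it as the virtual bezel, interprets touches arriving there, and stores the user's response instruction as personalized behavior for that region.

from typing import Dict, List, Optional, Tuple

def bounding_box(points: List[Tuple[int, int]]) -> Tuple[int, int, int, int]:
    xs, ys = zip(*points)
    return min(xs), min(ys), max(xs), max(ys)

class VirtualBezelRegistry:
    def __init__(self):
        self.bezel_region: Optional[Tuple[int, int, int, int]] = None
        self.personalized_behavior: Dict[str, str] = {}

    def register_grip(self, grip_contacts: List[Tuple[int, int]]) -> None:
        # "detecting a region of the touchscreen display in contact with
        # fingers of a user holding the electronic device"
        self.bezel_region = bounding_box(grip_contacts)

    def register_instruction(self, gesture: str, response: str) -> None:
        # "registering the user's response instruction ... as personalized
        # behavior for the virtual bezel region"
        self.personalized_behavior[gesture] = response

    def interpret(self, gesture: str) -> str:
        return self.personalized_behavior.get(gesture, "scroll_active_content")

registry = VirtualBezelRegistry()
registry.register_grip([(5, 900), (8, 1100), (12, 1300)])
registry.register_instruction("bezel_double_tap", "toggle_full_screen")
print(registry.bezel_region)
print(registry.interpret("bezel_double_tap"))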

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display (photovoltaic array, electrical power, digital camera, visual device, service provider) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (photovoltaic array, electrical power, digital camera, visual device, service provider) , where the said user input (discharge cycles, output port, user input) area (photovoltaic array, electrical power, digital camera, visual device, service provider) comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (spectral efficiency) to define a personalized holding pattern (discharge cycles, output port, user input) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120032876A1
CLAIM 19
. A Communication device of claim 18 , wherein said object (area comprising vertices) further comprises at least a near solid object including a human finger ;
said movement further comprises at least one of : vertical movement ;
horizontal movement ;
diagonal movement .

US20120032876A1
CLAIM 25
. A communication device of claim 1 , wherein said power management module further comprises a cell platform comprising at least one of nickel-cadmium batteries (NiCd) ;
nickel oxide hydroxide ;
metallic cadmium ;
wafer module ;
capacitor ;
complementary metal oxide semiconductor ;
wherein said capacitor operatively configured to withstand higher number of charge/discharge cycles (holding pattern, user input, s thermal sensors) and faster charge and discharge rates .

US20120032876A1
CLAIM 39
. A communication device of claim 1 , wherein said communication apparatus further comprises circuit board comprising electronic system's applications being configured for at least one of : a wired communications device ;
a wireless communications device ;
a cell phone ;
a handheld communication device ;
laptop computer ;
desktop computer ;
telemetry device ;
a switching device ;
MP3 player ;
a router ;
a repeater , a codec ;
a LAN ;
a WLAN ;
a Bluetooth enabled device ;
a digital camera (touchscreen area, user input area, touchscreen display) ;
a digital audio player and/or recorder ;
a digital video player and/or recorder ;
a computer ;
a monitor ;
a television set ;
a satellite set top box ;
a cable modem ;
a digital automotive control system ;
a control module ;
a communication module ;
a digitally-controlled home appliance ;
a printer ;
a copier ;
a digital audio or video receiver ;
an RF transceiver ;
a personal digital assistant (PDA) ;
a digital game playing device ;
a digital testing and/or measuring device ;
a digital avionics device ;
a media device ;
a medical device ;
a digitally-controlled medical equipment .

US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power (touchscreen area, user input area, touchscreen display) generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer module ;
photovoltaic array (touchscreen area, user input area, touchscreen display) ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 59
. A communication device of claim 1 , further comprises an audio device configured with at least an audio/visual device (touchscreen area, user input area, touchscreen display) operable for at least one of inputting communications ;
outputting communications ;
wherein said at least one audio/visual device further comprises at least one of : a touch screen input/output device ;
at least a speaker device ;
at least a microphone device ;
said speaker device ;
at least a voice enabled communications device , including Voice Over Internet Protocol “VOIP” .

US20120032876A1
CLAIM 71
. A communication device of claim 1 , wherein said antenna apparatus further comprises at least one of : at least a transceiver station ;
at least a signal routing apparatus ;
at least an apparatus operable to support higher data rates ;
at least an apparatus operable to improve spectral efficiency (usage frequency) ;
at least an apparatus operable to reduce network latency ;
at least an apparatus operable to provide flexible channel bandwidth ;
at least an apparatus operable to support flexible channel bandwidth ;
means for simplifying and/or flattening at least a communication architecture .

US20120032876A1
CLAIM 74
. A communication device of claim 1 , wherein said communication apparatus further comprising at least one of : at least an input/output device configured with at least an output port (holding pattern, user input, s thermal sensors) ;
means for generating downlink signals ;
means for transmitting downlink signals ;
means for transmitting at least a downlink signals with overlapping frequencies ;
means for generating downlink signals of different communication contents ;
at least a transceiver having at least an uplink receive port to receive at least an uplink signal ;
said at least one antenna apparatus configured with said communication apparatus for providing at least a non-overlapping coverage area .

US20120032876A1
CLAIM 76
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : means for maximizing at least a boundary coverage area ;
means for minimizing at least a boundary coverage area ;
means for accepting at least user input (holding pattern, user input, s thermal sensors) ;
at least a reconfiguration switch apparatus ;
at least a hard-wired-signal splitter means ;
at least a time division duplexed at said at least one antenna apparatus ;
at least a frequency division duplexed at said at least one antenna apparatus ;
means for minimizing boundaries between coverage areas ;
at least a duplexer apparatus ;
means for dividing communication signals into plurality communication channels ;
means for increasing cell coverage ;
means for lowering output power ;
at least a multiple input-multiple output apparatus .

US20120032876A1
CLAIM 81
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : at least a cellular base station ;
at least a WiMax base station ;
at least an LTE base station ;
at least a mobile telephone switching center ;
at least a telecommunication service provider (touchscreen area, user input area, touchscreen display) ;
at least a wide area network hub ;
at least an Internet service provider ;
at least a public telecommunication network ;
at least a metropolitan network ;
at least a rural area network ;
at least a transportable communication network ;
at least a vehicular systems communication network ;
at least a computer system communication network ;
at least a network for providing at least a communication path from at least a base station to at least a communication switching center ;
at least a millimeter wave link communication path ;
at least a radio communication apparatus comprising at least a millimeter wave system ;
at least a millimeter wave antenna for producing at least millimeter wave beam .
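
Illustrative sketch (the promotion threshold and data structures are assumptions): claim 17 registers the polygonal area formed by unintentional grip touches, tracks how frequently that area is contacted, and uses the polygon together with its usage frequency to define a personalized holding pattern that in turn defines the virtual bezel region.

from collections import Counter
from typing import List, Tuple

Polygon = Tuple[Tuple[int, int], ...]

class HoldingPatternLearner:
    def __init__(self, promote_after: int = 5):
        self.frequency: Counter = Counter()
        self.promote_after = promote_after
        self.holding_pattern: List[Polygon] = []     # defines the virtual bezel

    def record_unintentional_touch(self, vertices: List[Tuple[int, int]]) -> None:
        polygon: Polygon = tuple(sorted(vertices))   # register the polygonal area
        self.frequency[polygon] += 1                 # detect access frequency
        if (self.frequency[polygon] >= self.promote_after
                and polygon not in self.holding_pattern):
            self.holding_pattern.append(polygon)     # personalized holding pattern

learner = HoldingPatternLearner(promote_after=3)
grip = [(0, 800), (30, 800), (30, 1200), (0, 1200)]
for _ in range(3):
    learner.record_unintentional_touch(grip)
print(learner.holding_pattern)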

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display (photovoltaic array, electrical power, digital camera, visual device, service provider) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors (electromagnetic signals) , wherein the heat signature forms an area comprising vertices (said object) of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (discharge cycles, output port, user input) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (spectral efficiency) to define a personalized holding pattern (discharge cycles, output port, user input) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120032876A1
CLAIM 19
. A Communication device of claim 18 , wherein said object (area comprising vertices) further comprises at least a near solid object including a human finger ;
said movement further comprises at least one of : vertical movement ;
horizontal movement ;
diagonal movement .

US20120032876A1
CLAIM 25
. A communication device of claim 1 , wherein said power management module further comprises a cell platform comprising at least one of nickel-cadmium batteries (NiCd) ;
nickel oxide hydroxide ;
metallic cadmium ;
wafer module ;
capacitor ;
complementary metal oxide semiconductor ;
wherein said capacitor operatively configured to withstand higher number of charge/discharge cycles (holding pattern, user input, s thermal sensors) and faster charge and discharge rates .

US20120032876A1
CLAIM 39
. A communication device of claim 1 , wherein said communication apparatus further comprises circuit board comprising electronic system's applications being configured for at least one of : a wired communications device ;
a wireless communications device ;
a cell phone ;
a handheld communication device ;
laptop computer ;
desktop computer ;
telemetry device ;
a switching device ;
MP3 player ;
a router ;
a repeater , a codec ;
a LAN ;
a WLAN ;
a Bluetooth enabled device ;
a digital camera (touchscreen area, user input area, touchscreen display) ;
a digital audio player and/or recorder ;
a digital video player and/or recorder ;
a computer ;
a monitor ;
a television set ;
a satellite set top box ;
a cable modem ;
a digital automotive control system ;
a control module ;
a communication module ;
a digitally-controlled home appliance ;
a printer ;
a copier ;
a digital audio or video receiver ;
an RF transceiver ;
a personal digital assistant (PDA) ;
a digital game playing device ;
a digital testing and/or measuring device ;
a digital avionics device ;
a media device ;
a medical device ;
a digitally-controlled medical equipment .

US20120032876A1
CLAIM 50
. A communication device of claim 1 , wherein said detection platform further comprises at least one of : solar panel for converting light photons into a photo generated electrical energy ;
optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electrical power (touchscreen area, user input area, touchscreen display) generating system ;
at least an energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer module ;
photovoltaic array (touchscreen area, user input area, touchscreen display) ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
CMOS antenna with meta material structured surface cavity ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology applications ;
photovoltaic module ;
intra-chip antenna network ;
multiple paths antenna network .

US20120032876A1
CLAIM 53
. A communication device of claim 1 , wherein said display apparatus further configured for at least one of : inhabit plastic shrinkage cracking ;
reduce explosive spalling in high temperature ;
reduce water migration ;
reduce permeability ;
reduce settlement cracking ;
improve cohesion ;
resist fatigue ;
resist shatter ;
resist impact ;
provide residual strength ;
connections with at least one remote electronic device (electronic device) .

US20120032876A1
CLAIM 59
. A communication device of claim 1 , further comprises an audio device configured with at least an audio/visual device (touchscreen area, user input area, touchscreen display) operable for at least one of inputting communications ;
outputting communications ;
wherein said at least one audio/visual device further comprises at least one of : a touch screen input/output device ;
at least a speaker device ;
at least a microphone device ;
said speaker device ;
at least a voice enabled communications device , including Voice Over Internet Protocol “VOIP” .

US20120032876A1
CLAIM 71
. A communication device of claim 1 , wherein said antenna apparatus further comprises at least one of : at least a transceiver station ;
at least a signal routing apparatus ;
at least an apparatus operable to support higher data rates ;
at least an apparatus operable to improve spectral efficiency (usage frequency) ;
at least an apparatus operable to reduce network latency ;
at least an apparatus operable to provide flexible channel bandwidth ;
at least an apparatus operable to support flexible channel bandwidth ;
means for simplifying and/or flattening at least a communication architecture .

US20120032876A1
CLAIM 74
. A communication device of claim 1 , wherein said communication apparatus further comprising at least one of : at least an input/output device configured with at least an output port (holding pattern, user input, s thermal sensors) ;
means for generating downlink signals ;
means for transmitting downlink signals ;
means for transmitting at least a downlink signals with overlapping frequencies ;
means for generating downlink signals of different communication contents ;
at least a transceiver having at least an uplink receive port to receive at least an uplink signal ;
said at least one antenna apparatus configured with said communication apparatus for providing at least a non-overlapping coverage area .

US20120032876A1
CLAIM 76
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : means for maximizing at least a boundary coverage area ;
means for minimizing at least a boundary coverage area ;
means for accepting at least user input (holding pattern, user input, s thermal sensors) ;
at least a reconfiguration switch apparatus ;
at least a hard-wired-signal splitter means ;
at least a time division duplexed at said at least one antenna apparatus ;
at least a frequency division duplexed at said at least one antenna apparatus ;
means for minimizing boundaries between coverage areas ;
at least a duplexer apparatus ;
means for dividing communication signals into plurality communication channels ;
means for increasing cell coverage ;
means for lowering output power ;
at least a multiple input-multiple output apparatus .

US20120032876A1
CLAIM 81
. A communication device of claim 1 , wherein said communication apparatus further comprises at least one of : at least a cellular base station ;
at least a WiMax base station ;
at least an LTE base station ;
at least a mobile telephone switching center ;
at least a telecommunication service provider (touchscreen area, user input area, touchscreen display) ;
at least a wide area network hub ;
at least an Internet service provider ;
at least a public telecommunication network ;
at least a metropolitan network ;
at least a rural area network ;
at least a transportable communication network ;
at least a vehicular systems communication network ;
at least a computer system communication network ;
at least a network for providing at least a communication path from at least a base station to at least a communication switching center ;
at least a millimeter wave link communication path ;
at least a radio communication apparatus comprising at least a millimeter wave system ;
at least a millimeter wave antenna for producing at least millimeter wave beam .

US20120032876A1
CLAIM 84
. A communication device of claim 1 , wherein said antenna apparatus further comprises at least one of : at least nano-antennas ;
at least nanorectennas ;
means for enabling rectification at least at high frequencies ;
etching at least a portion of at least a silicon substrate to form at least an antenna comprising of CMOS multiple antenna on chip , wherein the antenna apparatus comprising at least an opened end and at least a shorted end ;
applying a photoresist material to the etched portion ;
embedding sensors in association with the photoresist material ;
exposing the photoresist material to ultraviolet energy ;
resonating at least in the GHz range ;
forming said antenna in at least a millimeter measurement ;
forming said antenna in at least um thickness range ;
providing means for radiating RF fields ;
providing means for at least a deferential feed in association with said means for radiating RF fields ;
providing means for feeding at least electromagnetic signals (thermal sensors) ;
at least an apparatus operable substantially without band gap limitations .
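
Illustrative sketch (the grid format and the skin-temperature threshold are assumptions; the claim only requires that the heat signature from the device's thermal sensors form the vertices of a polygonal area): claim 18 differs from claim 17 in that the polygon is derived from a thermal reading of the holding hand rather than from unintentional touch contacts.

from typing import List, Tuple

def heat_signature_vertices(thermal_map: List[List[float]],
                            threshold_c: float = 31.0) -> List[Tuple[int, int]]:
    """Cells above the threshold stand in for the hand's heat signature; the
    corners of their bounding area become the registered polygon vertices."""
    hot = [(x, y)
           for y, row in enumerate(thermal_map)
           for x, temp in enumerate(row)
           if temp >= threshold_c]
    if not hot:
        return []
    xs, ys = zip(*hot)
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]

thermal = [
    [24.0, 24.5, 25.0, 24.0],
    [33.0, 34.0, 25.0, 24.0],   # warm cells where the palm rests
    [33.5, 34.5, 25.0, 24.0],
    [24.0, 25.0, 25.0, 24.0],
]
print(heat_signature_vertices(thermal))   # [(0, 1), (1, 1), (1, 2), (0, 2)]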




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2447818A1

Filed: 2010-10-07     Issued: 2012-05-02

Method and portable electronic device for presenting text

(Original Assignee) Research in Motion Ltd     (Current Assignee) BlackBerry Ltd

Scott Peter Gammon
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device) comprising : a touch-sensitive display screen (display screen) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen (display screen) of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (display screen) .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device , comprising : determining a location of an onscreen position indicator in text displayed on a display screen (display screen) of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (display screen) .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device , comprising : determining a location of an onscreen position indicator in text displayed on a display screen (display screen) of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (display screen) comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen (display screen) of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen (display screen) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen (display screen) of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (display screen) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen (display screen) of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

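For illustration of the claim 16 steps charted above (detecting the grip-contact region, registering it as the virtual bezel, and storing the user's chosen response), a minimal Python sketch. All class, method, and argument names are hypothetical and are not taken from the patent or the cited reference.

    # Hypothetical sketch of the claim 16 flow: detect the grip-contact region,
    # register it as the virtual bezel, and store the user's response instruction.
    class VirtualBezelSetup:
        def __init__(self):
            self.bezel_region = None           # registered detected region
            self.personalized_behavior = None  # user's response instruction

        def detect_grip_region(self, contact_points):
            # Treat the bounding box of resting finger contacts as the detected region.
            xs = [p[0] for p in contact_points]
            ys = [p[1] for p in contact_points]
            return (min(xs), min(ys), max(xs), max(ys))

        def register(self, region, response_instruction):
            # "Registering ... in a memory of the electronic device" is modeled here
            # as plain attribute assignment.
            self.bezel_region = region
            self.personalized_behavior = response_instruction

    setup = VirtualBezelSetup()
    region = setup.detect_grip_region([(5, 800), (8, 860), (12, 910)])
    setup.register(region, response_instruction="scroll_content")
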
US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

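For illustration of the claim 17 flow charted above (an unintentional-touch polygon whose access frequency defines a personalized holding pattern), a minimal Python sketch under assumed data structures; the identifiers and the threshold value are hypothetical.

    # Hypothetical sketch: unintentional edge touches supply the vertices of a polygonal
    # area, and its detected usage frequency defines a personalized holding pattern.
    from collections import Counter

    class HoldingPatternModel:
        def __init__(self):
            self.polygons = {}              # polygon id -> vertex list
            self.access_counts = Counter()  # polygon id -> detected usage frequency

        def register_polygon(self, poly_id, vertices):
            self.polygons[poly_id] = list(vertices)

        def record_access(self, poly_id):
            self.access_counts[poly_id] += 1

        def personalized_bezel(self, min_hits=3):
            # Polygons touched often enough are taken as the personalized virtual bezel.
            return [self.polygons[p] for p, n in self.access_counts.items() if n >= min_hits]

    model = HoldingPatternModel()
    model.register_polygon("left_thumb", [(0, 700), (30, 700), (30, 1000), (0, 1000)])
    for _ in range(4):
        model.record_access("left_thumb")
    print(model.personalized_bezel())
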
US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2447818A1
CLAIM 1
A method of displaying text on a portable electronic device (electronic device) , comprising : determining a location of an onscreen position indicator in text displayed on a display screen of the portable electronic device ;
and displaying a selected portion of the text in an area in relation to the location of the onscreen position indicator in enlarged text .

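For illustration of the claim 18 idea charted above (a heat signature from thermal sensors bounding a polygonal bezel area), a minimal Python sketch; the grid values, threshold, and function names are hypothetical and not drawn from either document.

    # Hypothetical sketch: threshold a thermal-sensor grid to find warm cells left by the
    # holding hand and use their extent as a polygonal virtual bezel area.
    HEAT_THRESHOLD = 30.0  # degrees C, illustrative

    def warm_cells(thermal_grid):
        return [(x, y)
                for y, row in enumerate(thermal_grid)
                for x, temp in enumerate(row)
                if temp >= HEAT_THRESHOLD]

    def bounding_polygon(cells):
        xs = [c[0] for c in cells]
        ys = [c[1] for c in cells]
        # Axis-aligned rectangle standing in for the claimed polygonal area.
        return [(min(xs), min(ys)), (max(xs), min(ys)), (max(xs), max(ys)), (min(xs), max(ys))]

    grid = [[22.0, 22.5, 23.0],
            [31.0, 32.5, 23.0],
            [31.5, 30.5, 22.0]]
    print(bounding_polygon(warm_cells(grid)))
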



US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
CN102668522A

Filed: 2010-10-04     Issued: 2012-09-12

Apparatus comprising a sliding display portion

(Original Assignee) Nokia Oyj     (Current Assignee) Nokia Technologies Oy

M·奥克斯曼, T·伊瓦斯科维休斯, K·埃基南, T·卡皮艾南, J·瓦纳南, K·鲁特萨莱南
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set (一组控制) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel (手持电子设备) region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (一组控制) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
CN102668522A
CLAIM 5
. The apparatus according to claim 1, wherein the apparatus comprises a set of control (first set, second set, third set) buttons on the secondary display, and the apparatus is configured to control information on the primary display based on input received from the user via the control buttons.

CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

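For illustration of the mapped CN102668522A arrangement charted above (control buttons on a secondary display driving the content of the primary display), a minimal Python sketch; the class and button names are hypothetical.

    # Hypothetical sketch: buttons rendered on a secondary (edge) display issue commands
    # that change what the primary display shows.
    class PrimaryDisplay:
        def __init__(self):
            self.page = 0

        def apply(self, command):
            if command == "next":
                self.page += 1
            elif command == "prev":
                self.page = max(0, self.page - 1)

    class SecondaryDisplayButtons:
        def __init__(self, primary):
            self.primary = primary
            self.buttons = {"next": "next", "prev": "prev"}  # a set of control buttons

        def press(self, name):
            self.primary.apply(self.buttons[name])

    primary = PrimaryDisplay()
    SecondaryDisplayButtons(primary).press("next")
    assert primary.page == 1
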
US9645663B2
CLAIM 3
. The display system according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel (手持电子设备) region is processed as a touch-based input within the active touchscreen region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 4
. The display system according to claim 1 , wherein a touch-based input originating in the virtual bezel (手持电子设备) region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

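For illustration of the claim 3 / claim 4 rule charted above (a gesture is processed according to the region in which it originated, regardless of where it terminates), a minimal Python sketch with hypothetical geometry values.

    # Hypothetical sketch: the originating region owns the whole gesture.
    def region_of(x, y, bezel_width=40, width=1080, height=1920):
        edge = (x < bezel_width or x > width - bezel_width or
                y < bezel_width or y > height - bezel_width)
        return "virtual_bezel" if edge else "active_region"

    def attribute_gesture(start, end):
        # The end point is intentionally ignored: the origin decides the handling.
        return region_of(*start)

    assert attribute_gesture((500, 900), (10, 900)) == "active_region"   # claim 3 case
    assert attribute_gesture((10, 900), (500, 900)) == "virtual_bezel"   # claim 4 case
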
US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (手持电子设备) region is processed as a multi-touch input within the virtual bezel region of the display screen .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (手持电子设备) region is processed as a multi-touch input within the active touchscreen region of the display screen .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (手持电子设备) region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel (手持电子设备) region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 9
. The display system according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel (手持电子设备) region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel (手持电子设备) region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel (手持电子设备) region function to process a third set (一组控制) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
CN102668522A
CLAIM 5
. The apparatus according to claim 1, wherein the apparatus comprises a set of control (first set, second set, third set) buttons on the secondary display, and the apparatus is configured to control information on the primary display based on input received from the user via the control buttons.

CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel (手持电子设备) display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel (手持电子设备) display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 16
. A method of defining a virtual bezel (手持电子设备) region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

US9645663B2
CLAIM 17
. A method of defining a virtual bezel (手持电子设备) region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency (的使用) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

CN102668522A
CLAIM 13
. A computer program embodied on a computer-readable medium containing computer-executable program code, the computer-executable program code being executed by at least one processor of an apparatus, wherein the apparatus comprises a first part slidably attached to a second part, the first part having a primary display and a secondary display, the first part having a surface for providing the primary display, the surface being curved or bent to form a side or side surface for providing the secondary display, and wherein the computer-executable program code, when executed by the apparatus, causes the apparatus to: operate the apparatus and control the usage (usage frequency) of the primary display and the secondary display

US9645663B2
CLAIM 18
. A method of defining a virtual bezel (手持电子设备) region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency (的使用) to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
CN102668522A
CLAIM 12
. The apparatus according to claim 1, wherein the apparatus is a mobile handheld electronic device (virtual bezel, virtual bezel region, virtual bezel display screen)

CN102668522A
CLAIM 13
. A computer program embodied on a computer-readable medium containing computer-executable program code, the computer-executable program code being executed by at least one processor of an apparatus, wherein the apparatus comprises a first part slidably attached to a second part, the first part having a primary display and a secondary display, the first part having a surface for providing the primary display, the surface being curved or bent to form a side or side surface for providing the secondary display, and wherein the computer-executable program code, when executed by the apparatus, causes the apparatus to: operate the apparatus and control the usage (usage frequency) of the primary display and the secondary display




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
JP2012073873A

Filed: 2010-09-29     Issued: 2012-04-12

Information processing apparatus and input method

(Original Assignee) Nec Casio Mobile Communications Ltd; Necカシオモバイルコミュニケーションズ株式会社     

Koji Inami, 興志 伊波
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (タッチ) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel (前記操作) region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

JP2012073873A
CLAIM 2
The information processing apparatus according to claim 1, comprising a display unit provided with a touch (display screen, screen mode) panel, wherein the operation key is a software key displayed on the display unit.

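For illustration of the mapped JP2012073873A behavior charted above (a function is executed when the measured pressing force stays within the same stored range for a minimum dwell time), a minimal Python sketch; the ranges, dwell time, and function names are hypothetical.

    # Hypothetical sketch: map pressing-force ranges to functions and fire the function
    # once the force has stayed in the same range for at least DWELL_MS.
    PRESSURE_RANGES = [((0.0, 1.0), "light_press_function"),
                       ((1.0, 3.0), "firm_press_function")]
    DWELL_MS = 300

    def range_of(force):
        for (lo, hi), name in PRESSURE_RANGES:
            if lo <= force < hi:
                return name
        return None

    def function_to_run(samples):
        # samples: list of (timestamp_ms, force) readings from the pressure sensor.
        start, current = None, None
        for t, force in samples:
            name = range_of(force)
            if name != current:
                current, start = name, t
            if current and t - start >= DWELL_MS:
                return current
        return None

    print(function_to_run([(0, 0.4), (100, 0.5), (200, 0.6), (350, 0.5)]))
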
US9645663B2
CLAIM 3
. The display system according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel (前記操作) region is processed as a touch-based input within the active touchscreen region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 4
. The display system according to claim 1 , wherein a touch-based input originating in the virtual bezel (前記操作) region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (前記操作) region is processed as a multi-touch input within the virtual bezel region of the display screen (タッチ) .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

JP2012073873A
CLAIM 2
The information processing apparatus according to claim 1, comprising a display unit provided with a touch (display screen, screen mode) panel, wherein the operation key is a software key displayed on the display unit.

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (前記操作) region is processed as a multi-touch input within the active touchscreen region of the display screen (タッチ) .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

JP2012073873A
CLAIM 2
The information processing apparatus according to claim 1, comprising a display unit provided with a touch (display screen, screen mode) panel, wherein the operation key is a software key displayed on the display unit.

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (前記操作) region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel (前記操作) region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 9
. The display system according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel (前記操作) region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel (前記操作) region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (タッチ) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
JP2012073873A
CLAIM 2
The information processing apparatus according to claim 1, comprising a display unit provided with a touch (display screen, screen mode) panel, wherein the operation key is a software key displayed on the display unit.

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel (前記操作) region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel (前記操作) display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

JP2012073873A
CLAIM 2
The information processing apparatus according to claim 1, comprising a display unit provided with a touch (display screen, screen mode) panel, wherein the operation key is a software key displayed on the display unit.

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel (前記操作) display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

JP2012073873A
CLAIM 2
The information processing apparatus according to claim 1, comprising a display unit provided with a touch (display screen, screen mode) panel, wherein the operation key is a software key displayed on the display unit.

US9645663B2
CLAIM 16
. A method of defining a virtual bezel (前記操作) region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 17
. A method of defining a virtual bezel (前記操作) region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.

US9645663B2
CLAIM 18
. A method of defining a virtual bezel (前記操作) region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
JP2012073873A
CLAIM 1
An information processing apparatus comprising: an operation key; a pressure detection unit that detects a pressing force applied to the operation (virtual bezel) key; a storage unit that stores a function in association with each of a plurality of pressing-force ranges; and a control unit that, when the pressing force remains within the same pressing-force range for a predetermined time or longer, executes the function corresponding to that pressing-force range.




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120244348A1

Filed: 2010-09-28     Issued: 2012-09-27

Touch panel

(Original Assignee) LG Chem Ltd     (Current Assignee) LG Chem Ltd

Min Soo Park, Se Woo YANG, Woo Ha Kim, Yoon Tae Hwang, Suk Ky Chang
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (containing nitrogen, molecular weight) intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120244348A1
CLAIM 6
. The touch panel according to claim 1 , wherein the acrylic resin has a weight average molecular weight (user input, user input area) of 300 , 000 to 1 , 500 , 000 .

US20120244348A1
CLAIM 8
. The touch panel according to claim 7 , wherein the cross-linkable functional group is a hydroxy group , a carboxyl group , a functional group containing nitrogen (user input, user input area) , an epoxy group or an isocyanate group .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input (containing nitrogen, molecular weight) intended to affect the display of the first portion of the content on the active touchscreen region .
US20120244348A1
CLAIM 6
. The touch panel according to claim 1 , wherein the acrylic resin has a weight average molecular weight (user input, user input area) of 300 , 000 to 1 , 500 , 000 .

US20120244348A1
CLAIM 8
. The touch panel according to claim 7 , wherein the cross-linkable functional group is a hydroxy group , a carboxyl group , a functional group containing nitrogen (user input, user input area) , an epoxy group or an isocyanate group .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input (containing nitrogen, molecular weight) in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120244348A1
CLAIM 6
. The touch panel according to claim 1 , wherein the acrylic resin has a weight average molecular weight (user input, user input area) of 300 , 000 to 1 , 500 , 000 .

US20120244348A1
CLAIM 8
. The touch panel according to claim 7 , wherein the cross-linkable functional group is a hydroxy group , a carboxyl group , a functional group containing nitrogen (user input, user input area) , an epoxy group or an isocyanate group .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input (containing nitrogen, molecular weight) area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120244348A1
CLAIM 6
. The touch panel according to claim 1 , wherein the acrylic resin has a weight average molecular weight (user input, user input area) of 300 , 000 to 1 , 500 , 000 .

US20120244348A1
CLAIM 8
. The touch panel according to claim 7 , wherein the cross-linkable functional group is a hydroxy group , a carboxyl group , a functional group containing nitrogen (user input, user input area) , an epoxy group or an isocyanate group .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input (containing nitrogen, molecular weight) in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120244348A1
CLAIM 6
. The touch panel according to claim 1 , wherein the acrylic resin has a weight average molecular weight (user input, user input area) of 300 , 000 to 1 , 500 , 000 .

US20120244348A1
CLAIM 8
. The touch panel according to claim 7 , wherein the cross-linkable functional group is a hydroxy group , a carboxyl group , a functional group containing nitrogen (user input, user input area) , an epoxy group or an isocyanate group .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120182249A1

Filed: 2010-09-22     Issued: 2012-07-19

Mount structure of touch input device having pressure sensitive sensor

(Original Assignee) Nissha Printing Co Ltd     (Current Assignee) Nissha Printing Co Ltd

Yuko Endo, Yuichiro Takai, Yoshihiro Kai, Takahiro Suzuki
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (touch panel) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set (shaped electrodes) of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (shaped electrodes) of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120182249A1
CLAIM 1
. A mount structure of a touch input device comprising : a touch input device having at least a touch panel (display screen, screen mode) , and a pressure sensitive sensor bonded to a lower surface of the touch panel ;
and a casing dented to have a level difference to allow the touch input device to be externally fitted in , and having a concave part or an opening part for a display device , and a frame-shaped support part to support a back surface peripheral part of the touch input device , in a bottom surface thereof , wherein the pressure sensitive sensor comprises : a first substrate ;
a second substrate arranged so as to be opposed to the first substrate and bonded to the lower surface of the touch panel ;
a pair of frame-shaped electrodes (second set, second portion) arranged on one or both surfaces of a surface of the first substrate opposed to the second substrate and a surface of the second substrate opposed to the first substrate , along an edge part of the first or second substrate ;
a conductive pressure sensitive ink member arranged on the surface of the first substrate opposed to the second substrate or the surface of the second substrate opposed to the first substrate so as to be apart from at least one of the pair of electrodes , and so as to be along the edge part of the first or second substrate , and having electric characteristics to be changed by a pressed force applied ;
a gap holding member arranged in a region between the first substrate and the second substrate , to bond the first substrate and the second substrate with its adhesiveness , and hold a gap between the pressure sensitive ink member and at least one of the pair of electrodes ;
and a pressure concentration member laminated and arranged in a shape of a dot on a surface opposite to the surface opposed to the second substrate , of the first substrate so as to support the pressure sensitive ink member , wherein a frame-shaped gasket is attached between the pressure sensitive sensor and the support part of the casing , and the gasket does not overlap with the pressure concentration member .

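Purely for illustration, and not described in this form in either document, a minimal Python sketch of one way an edge-mounted, frame-shaped pressure sensor reading could gate edge-zone touches so that only deliberate presses are accepted; the threshold and function names are hypothetical.

    # Hypothetical sketch: combine an edge-zone touch with a frame pressure-sensor reading.
    PRESS_THRESHOLD = 0.8  # normalized sensor output, illustrative

    def edge_press_accepted(touch_in_edge_zone, frame_sensor_value):
        # Only treat an edge-zone touch as input when the frame sensor also reports
        # a pressed force above the threshold.
        return bool(touch_in_edge_zone) and frame_sensor_value >= PRESS_THRESHOLD

    print(edge_press_accepted(True, 0.9))   # True: deliberate press
    print(edge_press_accepted(True, 0.2))   # False: resting grip, no real press
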
US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch panel) .
US20120182249A1
CLAIM 1
. A mount structure of a touch input device comprising : a touch input device having at least a touch panel (display screen, screen mode) , and a pressure sensitive sensor bonded to a lower surface of the touch panel ;
and a casing dented to have a level difference to allow the touch input device to be externally fitted in , and having a concave part or an opening part for a display device , and a frame-shaped support part to support a back surface peripheral part of the touch input device , in a bottom surface thereof , wherein the pressure sensitive sensor comprises : a first substrate ;
a second substrate arranged so as to be opposed to the first substrate and bonded to the lower surface of the touch panel ;
a pair of frame-shaped electrodes arranged on one or both surfaces of a surface of the first substrate opposed to the second substrate and a surface of the second substrate opposed to the first substrate , along an edge part of the first or second substrate ;
a conductive pressure sensitive ink member arranged on the surface of the first substrate opposed to the second substrate or the surface of the second substrate opposed to the first substrate so as to be apart from at least one of the pair of electrodes , and so as to be along the edge part of the first or second substrate , and having electric characteristics to be changed by a pressed force applied ;
a gap holding member arranged in a region between the first substrate and the second substrate , to bond the first substrate and the second substrate with its adhesiveness , and hold a gap between the pressure sensitive ink member and at least one of the pair of electrodes ;
and a pressure concentration member laminated and arranged in a shape of a dot on a surface opposite to the surface opposed to the second substrate , of the first substrate so as to support the pressure sensitive ink member , wherein a frame-shaped gasket is attached between the pressure sensitive sensor and the support part of the casing , and the gasket does not overlap with the pressure concentration member .

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch panel) .
US20120182249A1
CLAIM 1
. A mount structure of a touch input device comprising : a touch input device having at least a touch panel (display screen, screen mode) , and a pressure sensitive sensor bonded to a lower surface of the touch panel ;
and a casing dented to have a level difference to allow the touch input device to be externally fitted in , and having a concave part or an opening part for a display device , and a frame-shaped support part to support a back surface peripheral part of the touch input device , in a bottom surface thereof , wherein the pressure sensitive sensor comprises : a first substrate ;
a second substrate arranged so as to be opposed to the first substrate and bonded to the lower surface of the touch panel ;
a pair of frame-shaped electrodes arranged on one or both surfaces of a surface of the first substrate opposed to the second substrate and a surface of the second substrate opposed to the first substrate , along an edge part of the first or second substrate ;
a conductive pressure sensitive ink member arranged on the surface of the first substrate opposed to the second substrate or the surface of the second substrate opposed to the first substrate so as to be apart from at least one of the pair of electrodes , and so as to be along the edge part of the first or second substrate , and having electric characteristics to be changed by a pressed force applied ;
a gap holding member arranged in a region between the first substrate and the second substrate , to bond the first substrate and the second substrate with its adhesiveness , and hold a gap between the pressure sensitive ink member and at least one of the pair of electrodes ;
and a pressure concentration member laminated and arranged in a shape of a dot on a surface opposite to the surface opposed to the second substrate , of the first substrate so as to support the pressure sensitive ink member , wherein a frame-shaped gasket is attached between the pressure sensitive sensor and the support part of the casing , and the gasket does not overlap with the pressure concentration member .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (touch panel) comprises an electronic device status display panel (transparent window) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120182249A1
CLAIM 1
. A mount structure of a touch input device comprising : a touch input device having at least a touch panel (display screen, screen mode) , and a pressure sensitive sensor bonded to a lower surface of the touch panel ;
and a casing dented to have a level difference to allow the touch input device to be externally fitted in , and having a concave part or an opening part for a display device , and a frame-shaped support part to support a back surface peripheral part of the touch input device , in a bottom surface thereof , wherein the pressure sensitive sensor comprises : a first substrate ;
a second substrate arranged so as to be opposed to the first substrate and bonded to the lower surface of the touch panel ;
a pair of frame-shaped electrodes arranged on one or both surfaces of a surface of the first substrate opposed to the second substrate and a surface of the second substrate opposed to the first substrate , along an edge part of the first or second substrate ;
a conductive pressure sensitive ink member arranged on the surface of the first substrate opposed to the second substrate or the surface of the second substrate opposed to the first substrate so as to be apart from at least one of the pair of electrodes , and so as to be along the edge part of the first or second substrate , and having electric characteristics to be changed by a pressed force applied ;
a gap holding member arranged in a region between the first substrate and the second substrate , to bond the first substrate and the second substrate with its adhesiveness , and hold a gap between the pressure sensitive ink member and at least one of the pair of electrodes ;
and a pressure concentration member laminated and arranged in a shape of a dot on a surface opposite to the surface opposed to the second substrate , of the first substrate so as to support the pressure sensitive ink member , wherein a frame-shaped gasket is attached between the pressure sensitive sensor and the support part of the casing , and the gasket does not overlap with the pressure concentration member .

US20120182249A1
CLAIM 9
. The mount structure of the touch input device having the pressure sensitive sensor , according to claim 1 , wherein each of the first substrate and the second substrate is formed into a flat plate with a transparent material , and a transparent window (electronic device status display panel) part is formed in a part except for the pair of electrodes .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (transparent window) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120182249A1
CLAIM 9
. The mount structure of the touch input device having the pressure sensitive sensor , according to claim 1 , wherein each of the first substrate and the second substrate is formed into a flat plate with a transparent material , and a transparent window (electronic device status display panel) part is formed in a part except for the pair of electrodes .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (touch panel) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion (shaped electrodes) of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120182249A1
CLAIM 1
. A mount structure of a touch input device comprising : a touch input device having at least a touch panel (display screen, screen mode) , and a pressure sensitive sensor bonded to a lower surface of the touch panel ;
and a casing dented to have a level difference to allow the touch input device to be externally fitted in , and having a concave part or an opening part for a display device , and a frame-shaped support part to support a back surface peripheral part of the touch input device , in a bottom surface thereof , wherein the pressure sensitive sensor comprises : a first substrate ;
a second substrate arranged so as to be opposed to the first substrate and bonded to the lower surface of the touch panel ;
a pair of frame-shaped electrodes (second set, second portion) arranged on one or both surfaces of a surface of the first substrate opposed to the second substrate and a surface of the second substrate opposed to the first substrate , along an edge part of the first or second substrate ;
a conductive pressure sensitive ink member arranged on the surface of the first substrate opposed to the second substrate or the surface of the second substrate opposed to the first substrate so as to be apart from at least one of the pair of electrodes , and so as to be along the edge part of the first or second substrate , and having electric characteristics to be changed by a pressed force applied ;
a gap holding member arranged in a region between the first substrate and the second substrate , to bond the first substrate and the second substrate with its adhesiveness , and hold a gap between the pressure sensitive ink member and at least one of the pair of electrodes ;
and a pressure concentration member laminated and arranged in a shape of a dot on a surface opposite to the surface opposed to the second substrate , of the first substrate so as to support the pressure sensitive ink member , wherein a frame-shaped gasket is attached between the pressure sensitive sensor and the support part of the casing , and the gasket does not overlap with the pressure concentration member .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch panel) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120182249A1
CLAIM 1
. A mount structure of a touch input device comprising : a touch input device having at least a touch panel (display screen, screen mode) , and a pressure sensitive sensor bonded to a lower surface of the touch panel ;
and a casing dented to have a level difference to allow the touch input device to be externally fitted in , and having a concave part or an opening part for a display device , and a frame-shaped support part to support a back surface peripheral part of the touch input device , in a bottom surface thereof , wherein the pressure sensitive sensor comprises : a first substrate ;
a second substrate arranged so as to be opposed to the first substrate and bonded to the lower surface of the touch panel ;
a pair of frame-shaped electrodes arranged on one or both surfaces of a surface of the first substrate opposed to the second substrate and a surface of the second substrate opposed to the first substrate , along an edge part of the first or second substrate ;
a conductive pressure sensitive ink member arranged on the surface of the first substrate opposed to the second substrate or the surface of the second substrate opposed to the first substrate so as to be apart from at least one of the pair of electrodes , and so as to be along the edge part of the first or second substrate , and having electric characteristics to be changed by a pressed force applied ;
a gap holding member arranged in a region between the first substrate and the second substrate , to bond the first substrate and the second substrate with its adhesiveness , and hold a gap between the pressure sensitive ink member and at least one of the pair of electrodes ;
and a pressure concentration member laminated and arranged in a shape of a dot on a surface opposite to the surface opposed to the second substrate , of the first substrate so as to support the pressure sensitive ink member , wherein a frame-shaped gasket is attached between the pressure sensitive sensor and the support part of the casing , and the gasket does not overlap with the pressure concentration member .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
JP2012058910A

Filed: 2010-09-07     Issued: 2012-03-22

Mobile terminal device and program

(Original Assignee) Nec Corp; 日本電気株式会社     

Kentaro Ozawa, 健太郎 小澤
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen (画面全体) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel (前記操作) region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the resistance value measured by the resistance measuring means exceeds the threshold while the operation mode is the one-handed operation mode.

JP2012058910A
CLAIM 5
The mobile terminal device according to any one of claims 1 to 4, wherein in the one-handed operation mode the content to be displayed on the entire display screen (display screen, screen mode) is displayed in a reduced size and a user operation on the reduced display is accepted.

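For illustration of the mapped JP2012058910A claim 3 rule charted above (the operation mode flips between two-handed and one-handed when a measured resistance value crosses a threshold), a minimal Python sketch; the threshold value and mode names are hypothetical.

    # Hypothetical sketch: switch between two-handed and one-handed operation modes
    # based on a measured resistance value crossing a threshold.
    THRESHOLD_OHMS = 500.0  # illustrative

    def next_mode(current_mode, measured_resistance):
        if current_mode == "two_handed" and measured_resistance < THRESHOLD_OHMS:
            return "one_handed"
        if current_mode == "one_handed" and measured_resistance > THRESHOLD_OHMS:
            return "two_handed"
        return current_mode

    mode = "two_handed"
    for reading in (800.0, 450.0, 470.0, 900.0):
        mode = next_mode(mode, reading)
    assert mode == "two_handed"
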
US9645663B2
CLAIM 3
. The display system according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel (前記操作) region is processed as a touch-based input within the active touchscreen region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the resistance value measured by the resistance measuring means exceeds the threshold while the operation mode is the one-handed operation mode.

US9645663B2
CLAIM 4
. The display system according to claim 1 , wherein a touch-based input originating in the virtual bezel (前記操作) region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the resistance value measured by the resistance measuring means exceeds the threshold while the operation mode is the one-handed operation mode.

US9645663B2
CLAIM 5
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (前記操作) region is processed as a multi-touch input within the virtual bezel region of the display screen (画面全体) .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the resistance value measured by the resistance measuring means exceeds the threshold while the operation mode is the one-handed operation mode.

JP2012058910A
CLAIM 5
The mobile terminal device according to any one of claims 1 to 4, wherein, in the one-handed operation mode, the content to be shown on the entire display screen (前記表示画面全体: display screen, screen mode) is displayed at a reduced size and the user's operations on the reduced display are accepted.

US9645663B2
CLAIM 6
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (前記操作) region is processed as a multi-touch input within the active touchscreen region of the display screen (画面全体) .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.

JP2012058910A
CLAIM 5
The mobile terminal device according to any one of claims 1 to 4, wherein, in the one-handed operation mode, the content to be shown on the entire display screen (前記表示画面全体: display screen, screen mode) is displayed at a reduced size and the user's operations on the reduced display are accepted.

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel (前記操作) region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.
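
A minimal sketch of how the conflicting multi-touch cases of claims 5, 6 and 7 above differ only in policy: always the virtual bezel (claim 5), always the active region (claim 6), or a stored user instruction (claim 7). The resolve_multitouch name and the policy strings are assumptions.

  def resolve_multitouch(origin_regions, policy):
      # origin_regions: set of regions where the simultaneous touches begin.
      # policy: "bezel" (claim 5), "active" (claim 6), or a stored user instruction (claim 7).
      if origin_regions == {"active", "bezel"}:
          return "multi-touch processed within the %s region" % policy
      (only_region,) = origin_regions
      return "multi-touch processed within the %s region" % only_region

  if __name__ == "__main__":
      print(resolve_multitouch({"active", "bezel"}, "bezel"))    # claim 5 behaviour
      print(resolve_multitouch({"active", "bezel"}, "active"))   # claim 6 behaviour
      user_choice = "active"                                     # claim 7: user-supplied instruction
      print(resolve_multitouch({"active", "bezel"}, user_choice))
      print(resolve_multitouch({"active"}, "bezel"))             # single-region case is unaffected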

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel (前記操作) region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.

US9645663B2
CLAIM 9
. The display system according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel (前記操作) region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.
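
A minimal sketch of the bezel-resident chrome recited in claims 8 through 11 above: a status bar toggled by a predefined gesture and soft buttons the user can hide, reposition, or add. The BezelChrome class, gesture name, and coordinates are assumptions, not terminology from either patent.

  class BezelChrome:
      # Toy model of UI elements that reside in the virtual bezel region (claims 8-11).
      def __init__(self):
          self.status_bar_visible = True
          self.soft_buttons = {"back": (0, 100), "home": (0, 200)}   # name -> (x, y) inside the bezel

      def on_gesture(self, gesture):
          # Claim 8: a gesture from a predefined set toggles status-bar visibility / full-screen mode.
          if gesture == "two-finger-edge-swipe":
              self.status_bar_visible = not self.status_bar_visible

      def reposition(self, name, new_xy):
          # Claims 9-10: a soft button can be moved (or hidden) within the bezel region.
          self.soft_buttons[name] = new_xy

      def add_button(self, name, xy):
          # Claim 11: the user can add new soft buttons inside the bezel region.
          self.soft_buttons[name] = xy

  if __name__ == "__main__":
      chrome = BezelChrome()
      chrome.on_gesture("two-finger-edge-swipe")   # enter full-screen mode
      chrome.reposition("back", (0, 400))
      chrome.add_button("screenshot", (0, 600))
      print(chrome.status_bar_visible, chrome.soft_buttons)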

US9645663B2
CLAIM 11
. The display system according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel (前記操作) region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen (画面全体) comprises an electronic device status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
JP2012058910A
CLAIM 5
The mobile terminal device according to any one of claims 1 to 4, wherein, in the one-handed operation mode, the content to be shown on the entire display screen (前記表示画面全体: display screen, screen mode) is displayed at a reduced size and the user's operations on the reduced display are accepted.

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel (前記操作) region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.
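
A minimal sketch of claims 12 and 13 above, assuming hypothetical gesture names: a status display panel that can be hidden, and a third set of touch inputs that still supports navigation while the panel and soft buttons are hidden.

  def interpret(touch, panel_hidden, buttons_hidden):
      # Claim 13: with both the status panel and the soft buttons hidden, a third set of
      # touch-based inputs (here a toy "edge-swipe") remains available for navigation.
      if panel_hidden and buttons_hidden and touch == "edge-swipe":
          return "navigate (third set of touch-based inputs)"
      if touch == "tap-status-panel" and not panel_hidden:
          return "status panel interaction (claim 12, visible mode)"
      return "ordinary touch input"

  if __name__ == "__main__":
      print(interpret("edge-swipe", panel_hidden=True, buttons_hidden=True))
      print(interpret("tap-status-panel", panel_hidden=False, buttons_hidden=False))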

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (画面全体) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel (前記操作) region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.

JP2012058910A
CLAIM 5
The mobile terminal device according to any one of claims 1 to 4, wherein, in the one-handed operation mode, the content to be shown on the entire display screen (前記表示画面全体: display screen, screen mode) is displayed at a reduced size and the user's operations on the reduced display are accepted.

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (画面全体) , the gestural software application configured to produce the second mode of response in the virtual bezel (前記操作) region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.

JP2012058910A
CLAIM 5
The mobile terminal device according to any one of claims 1 to 4, wherein, in the one-handed operation mode, the content to be shown on the entire display screen (前記表示画面全体: display screen, screen mode) is displayed at a reduced size and the user's operations on the reduced display are accepted.

US9645663B2
CLAIM 16
. A method of defining a virtual bezel (前記操作) region of an electronic device having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.
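
A minimal procedural sketch of claim 16 above, assuming a hypothetical define_virtual_bezel helper: the grip-contact region is registered as the virtual bezel and the user's chosen response type is stored as personalized behavior for that region.

  def define_virtual_bezel(grip_touches, ask_user):
      # grip_touches: (x, y) points detected while the user is merely holding the device.
      # ask_user: callable returning the response type the user wants for that region.
      region = {(round(x, -1), round(y, -1)) for x, y in grip_touches}   # coarse 10-pixel cells
      response = ask_user()   # e.g. "ignore", "scroll", "forward-as-intentional"
      # The returned record stands in for what the claim registers in device memory.
      return {"virtual_bezel_region": region, "personalized_behavior": response}

  if __name__ == "__main__":
      record = define_virtual_bezel([(5, 800), (8, 810), (1075, 820)],
                                    ask_user=lambda: "forward-as-intentional")
      print(record)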

US9645663B2
CLAIM 17
. A method of defining a virtual bezel (前記操作) region of an electronic device having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.
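
A minimal sketch of claim 17 above, assuming a hypothetical frequency threshold: each unintentionally touched polygonal area is registered, its access frequency counted, and frequently hit polygons promoted into a personalized holding pattern that defines the virtual bezel region.

  from collections import Counter

  def update_holding_pattern(memory, polygon_vertices, threshold=3):
      # Register the polygon, count how often it is touched, and promote it into the
      # personalized holding pattern once the assumed frequency threshold is reached.
      key = tuple(sorted(polygon_vertices))
      memory["frequency"][key] += 1
      if memory["frequency"][key] >= threshold:
          memory["holding_pattern"].add(key)
      return memory

  if __name__ == "__main__":
      memory = {"frequency": Counter(), "holding_pattern": set()}
      grip_polygon = [(0, 700), (40, 700), (40, 900), (0, 900)]   # left-edge grip area
      for _ in range(3):
          update_holding_pattern(memory, grip_polygon)
      print(memory["holding_pattern"])   # the polygon now defines the virtual bezel region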

US9645663B2
CLAIM 18
. A method of defining a virtual bezel (前記操作) region of an electronic device having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
JP2012058910A
CLAIM 3
The mobile terminal device according to claim 1 or 2, wherein the operation (前記操作: virtual bezel) mode switching means switches to a one-handed operation mode when the resistance value measured by the resistance measuring means falls below a threshold while the operation mode is a two-handed operation mode, and switches to the two-handed operation mode when the measured resistance value exceeds the threshold while the operation mode is the one-handed operation mode.
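
A minimal sketch of claim 18 above, assuming a hypothetical thermal-sensor grid and temperature threshold: grid cells warmed by the gripping hand supply the polygon vertices that claim 17's flow would otherwise obtain from touch events.

  def heat_signature_polygon(thermal_readings, hot_celsius=30.0):
      # Cells of an assumed thermal-sensor grid that read above the threshold are taken as
      # the vertices of the polygonal area produced by the user's gripping hand.
      return [cell for cell, celsius in thermal_readings.items() if celsius >= hot_celsius]

  if __name__ == "__main__":
      readings = {(0, 800): 33.5, (10, 810): 32.0, (540, 960): 24.0}
      print(heat_signature_polygon(readings))   # only the warm grip cells survive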




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120034954A1

Filed: 2010-08-07     Issued: 2012-02-09

Mega communication and media apparatus configured to prevent brain cancerous deseases and to generate electrical energy

(Original Assignee) Joseph Akwo Tabe     

Joseph Akwo Tabe
US9645663B2
CLAIM 1
. A display system (wireless signals) for an electronic device (electronic device, sound waves) comprising : a touch-sensitive display screen (electronic wafer) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode (thin film solar cell, electronic book, graphic user) of response to a first set (telecommunications system) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion (textile fibers) of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
US20120034954A1
CLAIM 1
. An energy harvesting computer device for use in association with a communication device configured with signal booster apparatus , comprising : at least a communication apparatus ;
at least an input/output module ;
at least a microprocessor comprising a control logic configured with a software for controlling communications and for processing characters from said input (second mode) /out module ;
at least an antenna apparatus in communication with said at least one microprocessor ;
and at least a sensor apparatus embedded in silicon substrate and etched/fused in nanofiber/microfiber material to provide a detection platform with efficient detection selectivity and detection sensitivity .

US20120034954A1
CLAIM 2
. A computer device for use in association with a communication device claim 1 , wherein said at least one input/output module comprises a computer apparatus configured for providing character processing , each said input/output module comprising graphic user (first mode, screen mode) interface configured with at least one of : a display apparatus ;
at least a keyboard ;
at least a force/pressure responsive apparatus , at least a vibration responsive apparatus means for converting pressure ;
wherein said force/vibration apparatus further configured for generating electrical energy .

US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 12
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprises at least one of : a display device ;
an input/out device ;
digital video broadcast entertainment , digital audio broadcast , digital multimedia-broadcast , global positioning system , means for providing safety services , transportation road communication systems , universal mobile telecommunications system (first set) , touch screen input/output device operable for interactive communications ;
a pressure responsive energy producing device .

US20120034954A1
CLAIM 15
. A computer device for use in association with a communication device of claim 1 , wherein said power source further include at least one of : carbon char , carbon black , metal sulfides , metal oxides , organic materials , textile fibers (first portion) , zinc oxide (ZnO) , nano-wires , piezoelectric crystals , piezoelectric elements , sensory layer , wet etching , dry etching , electron-silicon substrate-oxide , metal oxide semiconductor , optical properties , glass fiber , substrate micro fiber , substrate nano-fiber , FPGA meta material structure , cell platform , solar cell , cell platform , nickel-cadmium batteries (NiCd) , nickel oxide hydroxide , metallic cadmium , wafer module , a capacitor module operatively configured to withstand higher number of charge/discharge cycles and faster charge and discharge rates , at least one power source further comprises a material being alloyed with nano-fiber/microfiber material .

US20120034954A1
CLAIM 16
. A computer device for use in association with a communication device of claim 1 , wherein said communication apparatus further comprises at least one of : signal amplifier comprising at least a variable gain module , social network platform , video recognition platform ;
voice over text platform ;
text to voice enabled/conversion platform ;
TDMA platform ;
WCDMA platform ;
CDMA platform ;
TDMB platform ;
digital/analog/GSM platform ;
GPS platform GPRS platform ;
TIHW platform ;
MFSCD platform ;
frequency authentication platform ;
multiple input/output platform ;
EDGSM platform ;
EDMA platform ;
OFDM platform ;
OFDMA platform ;
Wi-Fi platform ;
Wi-Max platform ;
wireless library platform ;
educational module ;
touch screen sensory platform ;
phone book ;
electronic book (first mode, screen mode) ;
electronic reader ;
dictionary ;
calendar ;
calculator ;
Internet service applications ;
energy generating apparatus ;
gaming apparatus ;
and/or Internet service connectivity operable for global roaming .

US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US20120034954A1
CLAIM 39
. The communication device of claim 1 , wherein said cell platform further comprises at least a thin film solar cell (first mode, screen mode) .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 2
. The display system (wireless signals) according to claim 1 , wherein the gestural software application is configured to produce the first mode (thin film solar cell, electronic book, graphic user) of response in the active touchscreen region .
US20120034954A1
CLAIM 2
. A computer device for use in association with a communication device claim 1 , wherein said at least one input/output module comprises a computer apparatus configured for providing character processing , each said input/output module comprising graphic user (first mode, screen mode) interface configured with at least one of : a display apparatus ;
at least a keyboard ;
at least a force/pressure responsive apparatus , at least a vibration responsive apparatus means for converting pressure ;
wherein said force/vibration apparatus further configured for generating electrical energy .

US20120034954A1
CLAIM 16
. A computer device for use in association with a communication device of claim 1 , wherein said communication apparatus further comprises at least one of : signal amplifier comprising at least a variable gain module , social network platform , video recognition platform ;
voice over text platform ;
text to voice enabled/conversion platform ;
TDMA platform ;
WCDMA platform ;
CDMA platform ;
TDMB platform ;
digital/analog/GSM platform ;
GPS platform GPRS platform ;
TIHW platform ;
MFSCD platform ;
frequency authentication platform ;
multiple input/output platform ;
EDGSM platform ;
EDMA platform ;
OFDM platform ;
OFDMA platform ;
Wi-Fi platform ;
Wi-Max platform ;
wireless library platform ;
educational module ;
touch screen sensory platform ;
phone book ;
electronic book (first mode, screen mode) ;
electronic reader ;
dictionary ;
calendar ;
calculator ;
Internet service applications ;
energy generating apparatus ;
gaming apparatus ;
and/or Internet service connectivity operable for global roaming .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US20120034954A1
CLAIM 39
. The communication device of claim 1 , wherein said cell platform further comprises at least a thin film solar cell (first mode, screen mode) .

US9645663B2
CLAIM 3
. The display system (wireless signals) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 4
. The display system (wireless signals) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 5
. The display system (wireless signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (electronic wafer) .
US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 6
. The display system (wireless signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (electronic wafer) .
US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 7
. The display system (wireless signals) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device, sound waves) for the gestural hardware on how a multi-touch input will be processed .
US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 8
. The display system (wireless signals) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 9
. The display system (wireless signals) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 10
. The display system (wireless signals) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 11
. The display system (wireless signals) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US9645663B2
CLAIM 12
. The display system (wireless signals) according to claim 9 , wherein the display screen (electronic wafer) comprises an electronic device (electronic device, sound waves) status display panel displaying at least one information item from a set of information items (input data) corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 12
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprises at least one of : a display device ;
an input/out device ;
digital video broadcast entertainment , digital audio broadcast , digital multimedia-broadcast , global positioning system , means for providing safety services , transportation road communication systems , universal mobile telecommunications system , touch screen (electronic device status display panel) input/output device operable for interactive communications ;
a pressure responsive energy producing device .

US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals (display system) associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value .

US20120034954A1
CLAIM 33
. The communication device of claim 1 , wherein said communication apparatus further comprises an integrated circuit chip comprising at least one of : a circuitry in association with the antenna ;
an energy harvesting device ;
means for converting non-electrical energy to electrical energy ;
at least an environmental sensory circuitry configured to perform at least one of detection , communication ;
communicating at least a characteristic associated with an external environment ;
receiving at least an acoustical input data (information items) ;
receiving at least an electrical impulse input data ;
at least a wireless communications circuitry operable for transmitting characteristic between the environmental sensory circuitry and the wireless communications circuitry .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 13
. The electronic device (electronic device, sound waves) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set (secondary e) of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 12
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprises at least one of : a display device ;
an input/out device ;
digital video broadcast entertainment , digital audio broadcast , digital multimedia-broadcast , global positioning system , means for providing safety services , transportation road communication systems , universal mobile telecommunications system , touch screen (electronic device status display panel) input/output device operable for interactive communications ;
a pressure responsive energy producing device .

US20120034954A1
CLAIM 13
. A computer device for use in association with a communication device of claim 1 , wherein said at least one communication apparatus comprises sensors embedded in silicon substrate and etched/fused in nano-fiber/microfiber material to provide at least one of : energy harvester apparatus ;
energy conversion device , sensory display/input device , interactive communication device , intelligence detection device , radiation prevention device , non cancerous communication device , secondary energy (third set) platform , primary energy platform , accelerated data processing device , solar energy to electrical energy conversion device , objects movement detection device , electronic document translation device , touch screen display device .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 14
. An electronic device (electronic device, sound waves) comprising : a handheld interactive electronic device having a virtual bezel display screen (electronic wafer) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode (thin film solar cell, electronic book, graphic user) of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion (textile fibers) of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode (said input) of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120034954A1
CLAIM 1
. An energy harvesting computer device for use in association with a communication device configured with signal booster apparatus , comprising : at least a communication apparatus ;
at least an input/output module ;
at least a microprocessor comprising a control logic configured with a software for controlling communications and for processing characters from said input (second mode) /out module ;
at least an antenna apparatus in communication with said at least one microprocessor ;
and at least a sensor apparatus embedded in silicon substrate and etched/fused in nanofiber/microfiber material to provide a detection platform with efficient detection selectivity and detection sensitivity .

US20120034954A1
CLAIM 2
. A computer device for use in association with a communication device claim 1 , wherein said at least one input/output module comprises a computer apparatus configured for providing character processing , each said input/output module comprising graphic user (first mode, screen mode) interface configured with at least one of : a display apparatus ;
at least a keyboard ;
at least a force/pressure responsive apparatus , at least a vibration responsive apparatus means for converting pressure ;
wherein said force/vibration apparatus further configured for generating electrical energy .

US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 15
. A computer device for use in association with a communication device of claim 1 , wherein said power source further include at least one of : carbon char , carbon black , metal sulfides , metal oxides , organic materials , textile fibers (first portion) , zinc oxide (ZnO) , nano-wires , piezoelectric crystals , piezoelectric elements , sensory layer , wet etching , dry etching , electron-silicon substrate-oxide , metal oxide semiconductor , optical properties , glass fiber , substrate micro fiber , substrate nano-fiber , FPGA meta material structure , cell platform , solar cell , cell platform , nickel-cadmium batteries (NiCd) , nickel oxide hydroxide , metallic cadmium , wafer module , a capacitor module operatively configured to withstand higher number of charge/discharge cycles and faster charge and discharge rates , at least one power source further comprises a material being alloyed with nano-fiber/microfiber material .

US20120034954A1
CLAIM 16
. A computer device for use in association with a communication device of claim 1 , wherein said communication apparatus further comprises at least one of : signal amplifier comprising at least a variable gain module , social network platform , video recognition platform ;
voice over text platform ;
text to voice enabled/conversion platform ;
TDMA platform ;
WCDMA platform ;
CDMA platform ;
TDMB platform ;
digital/analog/GSM platform ;
GPS platform GPRS platform ;
TIHW platform ;
MFSCD platform ;
frequency authentication platform ;
multiple input/output platform ;
EDGSM platform ;
EDMA platform ;
OFDM platform ;
OFDMA platform ;
Wi-Fi platform ;
Wi-Max platform ;
wireless library platform ;
educational module ;
touch screen sensory platform ;
phone book ;
electronic book (first mode, screen mode) ;
electronic reader ;
dictionary ;
calendar ;
calculator ;
Internet service applications ;
energy generating apparatus ;
gaming apparatus ;
and/or Internet service connectivity operable for global roaming .

US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 39
. The communication device of claim 1 , wherein said cell platform further comprises at least a thin film solar cell (first mode, screen mode) .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 15
. The electronic device (electronic device, sound waves) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (electronic wafer) , the gestural software application configured to produce the second mode (said input) of response in the virtual bezel region .
US20120034954A1
CLAIM 1
. An energy harvesting computer device for use in association with a communication device configured with signal booster apparatus , comprising : at least a communication apparatus ;
at least an input/output module ;
at least a microprocessor comprising a control logic configured with a software for controlling communications and for processing characters from said input (second mode) /out module ;
at least an antenna apparatus in communication with said at least one microprocessor ;
and at least a sensor apparatus embedded in silicon substrate and etched/fused in nanofiber/microfiber material to provide a detection platform with efficient detection selectivity and detection sensitivity .

US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer (display screen) module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device, sound waves) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device, sound waves) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area (photovoltaic array, electrical power, digital camera, electric field, service provider) , where the said user input area (photovoltaic array, electrical power, digital camera, electric field, service provider) comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (discharge cycles) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
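
The polygon-and-frequency logic of claim 17 can be pictured with the short sketch below; the promotion count and the class structure are invented for illustration and do not reflect the patentee's implementation.

```python
# Hypothetical model: unintentional edge touches form a polygon whose access
# frequency is tracked until it is promoted to the personalized holding pattern.
class HoldingPatternModel:
    def __init__(self, promote_after=20):
        self.polygons = {}            # polygon (tuple of vertices) -> access count
        self.promote_after = promote_after
        self.virtual_bezel = None     # promoted polygon, i.e. the personalized holding pattern

    def register_unintentional_touches(self, points):
        polygon = tuple(sorted(points))                       # vertices of the touched polygonal area
        self.polygons[polygon] = self.polygons.get(polygon, 0) + 1
        poly, count = max(self.polygons.items(), key=lambda kv: kv[1])
        if count >= self.promote_after:                       # habitual grip -> virtual bezel
            self.virtual_bezel = poly
        return self.virtual_bezel

if __name__ == "__main__":
    model = HoldingPatternModel(promote_after=3)
    grip = [(0, 10), (0, 40), (3, 40), (3, 10)]               # left-edge thumb rest
    for _ in range(3):
        bezel = model.register_unintentional_touches(grip)
    print("virtual bezel:", bezel)
```
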
US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 15
. A computer device for use in association with a communication device of claim 1 , wherein said power source further include at least one of : carbon char , carbon black , metal sulfides , metal oxides , organic materials , textile fibers , zinc oxide (ZnO) , nano-wires , piezoelectric crystals , piezoelectric elements , sensory layer , wet etching , dry etching , electron-silicon substrate-oxide , metal oxide semiconductor , optical properties , glass fiber , substrate micro fiber , substrate nano-fiber , FPGA meta material structure , cell platform , solar cell , cell platform , nickel-cadmium batteries (NiCd) , nickel oxide hydroxide , metallic cadmium , wafer module , a capacitor module operatively configured to withstand higher number of charge/discharge cycles (holding pattern) and faster charge and discharge rates , at least one power source further comprises a material being alloyed with nano-fiber/microfiber material .

US20120034954A1
CLAIM 17
. A computer device for use in association with a communication device of claim 1 , wherein said communication apparatus further comprises circuit board comprising electronic system's applications being configured for at least one of : a wired communications device , a wireless communications device , a cell phone , a handheld communication device , laptop computer , desktop computer , telemetry device , a switching device , MP3 player , a router , a repeater , a codec , a LAN , a WLAN , a Bluetooth enabled device , a digital camera (touchscreen area, user input area) , a digital audio player and/or recorder , a digital video player and/or recorder , a computer , a monitor , a television set , a satellite set top box , a cable modem , a digital automotive control system , a control module , a communication module , a digitally-controlled home appliance , a printer , a copier , a digital audio or video receiver , an RF transceiver , a personal digital assistant (PDA) , a digital game playing device , a digital testing and/or measuring device , a digital avionics device , a media device , a medical device , and a digitally-controlled medical equipment .

US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer module ;
photovoltaic array (touchscreen area, user input area) ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 28
. The communication device of claim 1 , wherein said communication apparatus further comprising at least one of : means for providing wireless interconnectivity for at least one of computer device ;
cell phone apparatus ;
gaming device ;
media device ;
entertainment device ;
to at least a worldwide computer network , said worldwide computer network further comprising at least a local internet service provider (touchscreen area, user input area) portal to said network and/or at least a satellite network ;
each operable for allocating at least one cell within at least a frequency threshold value ;
at least a time map is further allocated based on said threshold value , wherein said threshold value is either above or below and/or equal to the determined signal strength of the plurality of adjacent cells .

US20120034954A1
CLAIM 40
. The communication device of claim 1 , wherein silicon substrate further comprises at least one of : a transparent material ;
an electrically conductive material ;
an electrochromic element ;
an electrochromic unit ;
means for producing photovoltaic electric field (touchscreen area, user input area) .

US20120034954A1
CLAIM 41
. The communication device of claim 1 , wherein said cell platform further comprises at least a charge platform comprising at least one of : a charge circuit ;
means for controlling at least a charge to at least a cell means ;
means for controlling at least a charge to an energy storage device ;
means for outputting electrical energy ;
means for converting at least energy within an environment into at least electrical power (touchscreen area, user input area) .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device, sound waves) having a touchscreen display , the method comprising : receiving a heat signature (electric power, energy value) from a user's hand (threshold value) holding the electronic device utilizing device's thermal sensors (electromagnetic signals) , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern (discharge cycles) for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
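
Claim 18 substitutes a thermal source for the touch-derived polygon of claim 17; the fragment below sketches one way such a heat signature could be reduced to polygon vertices, using an assumed per-cell temperature grid and an assumed skin-temperature threshold.

```python
# Hypothetical reduction of a thermal map to the vertices of a bounding polygon
# (here a simple bounding box) over the cells warmed by the gripping hand.
def heat_signature_polygon(thermal_grid, skin_temp_c=30.0):
    """Return bounding-box vertices of cells warmer than skin_temp_c, or None."""
    warm = [(r, c) for r, row in enumerate(thermal_grid)
                   for c, t in enumerate(row) if t >= skin_temp_c]
    if not warm:
        return None
    rows = [r for r, _ in warm]
    cols = [c for _, c in warm]
    return [(min(rows), min(cols)), (min(rows), max(cols)),
            (max(rows), max(cols)), (max(rows), min(cols))]

if __name__ == "__main__":
    grid = [[22, 22, 22, 22],
            [33, 34, 22, 22],        # warm band where the hand wraps the edge
            [33, 35, 22, 22],
            [22, 22, 22, 22]]
    print(heat_signature_polygon(grid))   # -> [(1, 0), (1, 1), (2, 1), (2, 0)]
```
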
US20120034954A1
CLAIM 11
. A computer device for use in association with a communication device of claim 1 , wherein said at least one detection platform further comprising at least one of : a mobile phone case ;
a mobile phone housing ;
a mobile phone circuitry ;
at least a housing for an electronic device (electronic device) ;
at least a case for an electronic device ;
at least a housing for said communication apparatus ;
at least a circuit board for said communication apparatus ;
each said detection platform operable for generating electrical energy .

US20120034954A1
CLAIM 15
. A computer device for use in association with a communication device of claim 1 , wherein said power source further include at least one of : carbon char , carbon black , metal sulfides , metal oxides , organic materials , textile fibers , zinc oxide (ZnO) , nano-wires , piezoelectric crystals , piezoelectric elements , sensory layer , wet etching , dry etching , electron-silicon substrate-oxide , metal oxide semiconductor , optical properties , glass fiber , substrate micro fiber , substrate nano-fiber , FPGA meta material structure , cell platform , solar cell , cell platform , nickel-cadmium batteries (NiCd) , nickel oxide hydroxide , metallic cadmium , wafer module , a capacitor module operatively configured to withstand higher number of charge/discharge cycles (holding pattern) and faster charge and discharge rates , at least one power source further comprises a material being alloyed with nano-fiber/microfiber material .

US20120034954A1
CLAIM 20
. A computer device for use in association with a communication device of claim 1 , wherein said detection platform further comprises at least one of solar panel for converting light photons to a photo generating electrical energy , optical elements ;
a light shield film ;
a UV curing resin ;
at least a transparent support substrate ;
at least a plate ;
at least an electric power (heat signature) generating system ;
at least energy management apparatus ;
a heating and/or cooling module ;
method for manufacturing an electronic wafer module ;
photovoltaic array ;
solar module ;
solar cell ;
mono-crystalline silicon wafer ;
fuel cell , metal-ceramic membranes , film composite metal-ceramic materials , thin film ;
polymer ;
amplified signal transmitter/receiver ;
power generator engine ;
nanotechnology ;
photovoltaic module ;
at least an energy harvester ;
at least a nano-rectifier .

US20120034954A1
CLAIM 21
. A communication device of claim 1 , wherein said communication apparatus further comprising at least a wireless communication spectrum operable for at least one of : receiving one or more wireless signals associated with at least a frequency within the wireless communication spectrum ;
determining at least a signal strength for the received wireless signals ;
determining at least a signal strength for at least a cell within the frequency ;
allocating the at least one cell for enabling wireless transmission based on at least a predetermined threshold value (s hand) .

US20120034954A1
CLAIM 29
. The communication device of claim 1 , wherein said communication apparatus further comprising means for monitoring at least one of : at least a communication spectrum ;
at least a frequency ;
at least a time cell for a received wireless signal ;
at least a wavelet coefficients corresponding to the received wireless signal ;
at least an average energy based on at least a determined wavelet coefficients ;
at least a threshold value exceeding at least a predetermined energy value (heat signature) ;
at least a light emitting diode means ;
at least a diode controlling at least a form of energy ;
at least a wireless signal ;
at least a scalable cell ;
at least a channel within the wireless communication spectrum .

US20120034954A1
CLAIM 45
. The communication device of claim 1 , wherein said communication apparatus further comprises at least one of : a communication circuit board comprising solar cell platform ;
a CMOS multiple antenna on chip in communication with a rectifier for converting electromagnetic wave into electrical power ;
at least an antenna comprising nano-wire antenna in association with the rectifier ;
at least a nano-wire comprising a material for exhibiting good electrical properties operable for transmitting and for receiving electromagnetic signals (thermal sensors) ;
further comprises at least a gold material for receiving and for transmitting electromagnetic signals at higher frequencies ;
at least a touch screen comprising a sensory platform configured with at least one of : piezoelectric elements , MEMS , load cell , strain gauge , acoustic sensor , for converting pressure force and/or sound waves (electronic device) into electrical energy ;
at least a case comprising solar cell platform in association with a rectifier device for converting solar energy into electrical power ;
at least a case comprising nano sensors embedded in silicon substrate and alloyed with meta-material structure cavity and fused/etched in nano-fiber/microfiber material to exhibit a sensory platform for converting at least a form of energy within an environment into electrical energy .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120154328A1

Filed: 2010-08-05     Issued: 2012-06-21

Input apparatus

(Original Assignee) Kyocera Corp     (Current Assignee) Kyocera Corp

Kenji Kono
US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (display unit) using predefined set of gestures to toggle a full-screen mode .
US20120154328A1
CLAIM 1
. An input apparatus comprising : a touch sensor configured to detect a contact ;
a load detection unit configured to detect a pressure load on a touch face of the touch sensor ;
a display unit (status bar visibility) configured to display a slide bar ;
a tactile sensation providing unit configured to vibrate the touch face ;
and a control unit configured to control the tactile sensation providing unit such that a tactile sensation is provided to an object pressing the touch face based on a position of a knob of the slide bar shifted in response to the contact detected by the touch sensor while the pressure load detected by the load detection unit satisfies a predetermined standard .
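
As a reading aid for the Kyocera reference, the loop below sketches claim 1 of US20120154328A1 with invented sensor values: tactile output is gated on the pressure load meeting a predetermined standard and is keyed to the slide-bar knob position.

```python
# Hypothetical control step: the tactile actuator fires only while the detected
# pressure load meets the predetermined standard, and feedback follows the knob.
PRESSURE_STANDARD_N = 1.5    # assumed "predetermined standard"

def control_step(contact_x, pressure_load_n, bar_min=0, bar_max=100):
    """Return (knob_position, vibrate?) for one sensor sample."""
    knob = max(bar_min, min(bar_max, contact_x))          # knob follows the contact
    vibrate = pressure_load_n >= PRESSURE_STANDARD_N      # gate feedback on the load
    return knob, vibrate

if __name__ == "__main__":
    for x, load in [(10, 0.4), (35, 1.6), (80, 2.1)]:
        print(control_step(x, load))    # vibrates only for the two pressed samples
```
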




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US20120019448A1

Filed: 2010-07-22     Issued: 2012-01-26

User Interface with Touch Pressure Level Sensing

(Original Assignee) Nokia Oyj     (Current Assignee) Nokia Oyj

Petri Sakari Pitkanen, Eero IImo Olavi Kukko, Gary Wingett, Daniel Feng
US9645663B2
CLAIM 1
. A display system (control pad) for an electronic device comprising : a touch-sensitive display screen (touch panel function) configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
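
The two response modes of claim 1 can be illustrated with the sketch below; the bezel width, gesture names and routing policy are assumptions, not limitations drawn from the claim.

```python
# Hypothetical routing: touches in the active region are handled directly,
# while touches in the virtual bezel pass through a selective filter before
# they may affect the content shown in the active region.
BEZEL_WIDTH = 40                  # px strip along each edge (assumed)
SCREEN_W, SCREEN_H = 1080, 1920

def in_virtual_bezel(x, y):
    return (x < BEZEL_WIDTH or x >= SCREEN_W - BEZEL_WIDTH or
            y < BEZEL_WIDTH or y >= SCREEN_H - BEZEL_WIDTH)

def second_mode(gesture):
    """Selectively interpret bezel input: only deliberate gestures pass."""
    deliberate = {"double-tap", "long-press", "two-finger-swipe"}
    return f"affect-content:{gesture}" if gesture in deliberate else "suppressed"

def first_mode(gesture):
    return f"active-region:{gesture}"

def dispatch(x, y, gesture):
    return second_mode(gesture) if in_virtual_bezel(x, y) else first_mode(gesture)

if __name__ == "__main__":
    print(dispatch(10, 500, "tap"))          # bezel, resting thumb -> suppressed
    print(dispatch(10, 500, "double-tap"))   # bezel, deliberate -> affects content
    print(dispatch(540, 960, "tap"))         # active region -> normal handling
```
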
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .
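
For the Nokia reference, the fragment below sketches the pressure-level mode switch of claims 18-19 of US20120019448A1; the numeric pressure boundaries and role names are hypothetical.

```python
# Hypothetical pressure-level classification: at the second pressure level the
# same panel is reinterpreted as a mouse touchpad or a media control pad.
LEVEL_1_MAX = 0.5            # assumed normalized pressure boundary

def classify_pressure(p):
    return 1 if p <= LEVEL_1_MAX else 2

def panel_role(pressure, configured_role="touchpad"):
    """Return how the touch panel functions for this touch."""
    if classify_pressure(pressure) == 2:
        return configured_role            # "touchpad" or "media-control"
    return "normal-touch"

if __name__ == "__main__":
    print(panel_role(0.2))                       # -> normal-touch
    print(panel_role(0.8))                       # -> touchpad
    print(panel_role(0.8, "media-control"))      # -> media-control
```
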

US9645663B2
CLAIM 2
. The display system (control pad) according to claim 1 , wherein the gestural software application is configured to produce the first mode of response in the active touchscreen region .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 3
. The display system (control pad) according to claim 1 , wherein a touch-based input originating in the active touchscreen region and terminating in the virtual bezel region is processed as a touch-based input within the active touchscreen region .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 4
. The display system (control pad) according to claim 1 , wherein a touch-based input originating in the virtual bezel region and terminating in the active touchscreen region is processed as a touch-based input within the virtual bezel region .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 5
. The display system (control pad) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the virtual bezel region of the display screen (touch panel function) .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 6
. The display system (control pad) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed as a multi-touch input within the active touchscreen region of the display screen (touch panel function) .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 7
. The display system (control pad) according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device for the gestural hardware on how a multi-touch input will be processed .
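
Claims 3-7 describe how strokes and multi-touch inputs that span both regions are attributed; the sketch below illustrates one plausible routing policy, with the user-preference mechanism of claim 7 reduced to a single parameter.

```python
# Hypothetical cross-region attribution: a stroke belongs to the region where
# it originated, and a spanning multi-touch is resolved by a user preference.
def route_stroke(origin_region, end_region):
    """Claims 3-4: the originating region owns the whole stroke."""
    return origin_region

def route_multitouch(regions_touched, user_preference="virtual_bezel"):
    """Claims 5-7: touches spanning both regions go to the region the user selected."""
    if {"active", "virtual_bezel"} <= set(regions_touched):
        return user_preference
    return regions_touched[0]

if __name__ == "__main__":
    print(route_stroke("active", "virtual_bezel"))                    # -> active
    print(route_stroke("virtual_bezel", "active"))                    # -> virtual_bezel
    print(route_multitouch(["active", "virtual_bezel"]))              # -> virtual_bezel
    print(route_multitouch(["active", "virtual_bezel"], "active"))    # -> active
```
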
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 8
. The display system (control pad) according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 9
. The display system (control pad) according to claim 1 , wherein a pre-defined set of touch-based soft buttons resides in the virtual bezel region , and wherein the user can reposition at least one touch-based soft button from the pre-defined set of touch-based soft buttons within the virtual bezel region .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 10
. The display system (control pad) according to claim 9 , wherein the user can toggle at least one touch-based soft button from the pre-defined set of touch-based soft buttons between a visible mode and a hidden mode .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 11
. The display system (control pad) according to claim 9 , wherein the user can add one or more touch-based soft buttons within the virtual bezel region .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 12
. The display system (control pad) according to claim 9 , wherein the display screen (touch panel function) comprises an electronic device status display panel (control pad) displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
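
The bezel-resident UI behaviors of claims 8-12 (gesture-toggled status bar, repositionable and hideable soft buttons, toggleable status panel) are illustrated in the sketch below; all widget names and positions are invented.

```python
# Hypothetical bezel UI state machine for claims 8-12.
class BezelUI:
    def __init__(self):
        self.status_bar_visible = True
        self.status_panel_visible = True
        self.soft_buttons = {"back": (5, 100), "home": (5, 200)}   # name -> bezel position
        self.hidden_buttons = set()

    def gesture(self, name):
        if name == "toggle-fullscreen":           # claim 8: predefined gesture
            self.status_bar_visible = not self.status_bar_visible
        elif name == "toggle-status-panel":       # claim 12
            self.status_panel_visible = not self.status_panel_visible

    def move_button(self, name, new_pos):         # claim 9: reposition within the bezel
        self.soft_buttons[name] = new_pos

    def toggle_button(self, name):                # claim 10: visible <-> hidden
        self.hidden_buttons ^= {name}

    def add_button(self, name, pos):              # claim 11: add a soft button
        self.soft_buttons[name] = pos

if __name__ == "__main__":
    ui = BezelUI()
    ui.gesture("toggle-fullscreen")
    ui.move_button("back", (5, 150))
    ui.toggle_button("home")
    ui.add_button("screenshot", (5, 300))
    print(ui.status_bar_visible, ui.soft_buttons, ui.hidden_buttons)
```
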
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 13
. The electronic device according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel (control pad) and the pre-defined set of touch-based soft buttons are in a hidden mode .
US20120019448A1
CLAIM 19
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning as media player control pad (display system, electronic device status display panel) .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen (touch panel function) , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US9645663B2
CLAIM 15
. The electronic device according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen (touch panel function) , the gestural software application configured to produce the second mode of response in the virtual bezel region .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (touch panel function) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (touch panel function) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (touch panel function) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US20120019448A1
CLAIM 18
. A method as in claim 15 wherein , when the touch pressure is determined to be the second level , the touch panel functioning (display screen, touchscreen display) as a mouse touchpad of an apparatus .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
US8270148B2

Filed: 2010-07-07     Issued: 2012-09-18

Suspension for a pressure sensitive touch display or panel

(Original Assignee) VATTERLEDENS INVEST AB     (Current Assignee) Apple Inc

David Griffith, Gary Smith, Mark Lackey, Anders Mölne
US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
US8270148B2
CLAIM 8
. The suspension system of claim 1 , wherein the at least one force sensor comprises one (operating system status bar) from the group consisting of a strain gauge , a load cell , and a force sensing resistor .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (sensitive touch) , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
US8270148B2
CLAIM 1
. A suspension system for mounting a force sensitive touch (touchscreen display) display or panel having a touch surface defining an x-y plane with a normal z-axis , comprising : a frame ;
a suspension membrane connected to the touch surface and the frame for suspending the touch surface from the frame , the membrane allowing freedom of movement of the touch surface along the z-axis and resisting movement of the touch surface within the x-y plane ;
and at least one force sensor connected beneath the touch surface , whereby the touch surface is pre-loaded against the at least one force sensor .
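
The pre-loaded force sensing of US8270148B2 claim 1 can be read as an offset-subtraction problem; the sketch below uses made-up calibration constants and is not based on the reference's actual electronics.

```python
# Hypothetical read-out: the suspension pre-load appears as a constant offset
# that is subtracted before the applied touch force is thresholded.
PRELOAD_COUNTS = 1200          # sensor reading with no touch (from the pre-load)
COUNTS_PER_NEWTON = 800
TOUCH_THRESHOLD_N = 0.25

def applied_force_newtons(raw_counts):
    return max(0.0, (raw_counts - PRELOAD_COUNTS) / COUNTS_PER_NEWTON)

def is_touch(raw_counts):
    return applied_force_newtons(raw_counts) >= TOUCH_THRESHOLD_N

if __name__ == "__main__":
    for counts in (1205, 1450, 2100):
        print(counts, round(applied_force_newtons(counts), 3), is_touch(counts))
```
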

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (sensitive touch) , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US8270148B2
CLAIM 1
. A suspension system for mounting a force sensitive touch (touchscreen display) display or panel having a touch surface defining an x-y plane with a normal z-axis , comprising : a frame ;
a suspension membrane connected to the touch surface and the frame for suspending the touch surface from the frame , the membrane allowing freedom of movement of the touch surface along the z-axis and resisting movement of the touch surface within the x-y plane ;
and at least one force sensor connected beneath the touch surface , whereby the touch surface is pre-loaded against the at least one force sensor .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device having a touchscreen display (sensitive touch) , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
US8270148B2
CLAIM 1
. A suspension system for mounting a force sensitive touch (touchscreen display) display or panel having a touch surface defining an x-y plane with a normal z-axis , comprising : a frame ;
a suspension membrane connected to the touch surface and the frame for suspending the touch surface from the frame , the membrane allowing freedom of movement of the touch surface along the z-axis and resisting movement of the touch surface within the x-y plane ;
and at least one force sensor connected beneath the touch surface , whereby the touch surface is pre-loaded against the at least one force sensor .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2386935A1

Filed: 2010-05-14     Issued: 2011-11-16

Method of providing tactile feedback and electronic device

(Original Assignee) Research in Motion Ltd     (Current Assignee) BlackBerry Ltd

Kuo-Feng Tong, Katarina Pavlikova, Yingying Lu, Arnett Ryan Weber
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar (comprises one) resides in the virtual bezel region , and wherein the user can toggle the status bar visibility using predefined set of gestures to toggle a full-screen mode .
EP2386935A1
CLAIM 11
The electronic device according to claim 9 , wherein the actuator comprises one (operating system status bar) or more piezoelectric actuators .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2386935A1
CLAIM 7
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2375309A1

Filed: 2010-04-08     Issued: 2011-10-12

Handheld device with localized delays for triggering tactile feedback

(Original Assignee) Research in Motion Ltd     (Current Assignee) BlackBerry Ltd

Kuo-Feng Tong, Ryan Arnett Weber, Jerome Pasquero, Derek Raymond Solven, Katarina Pavlikova
US9645663B2
CLAIM 1
. A display system for an electronic device (electronic device) comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer with a first mode of response to a first set of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 7
. The display system according to claim 1 , wherein a multi-touch input originating simultaneously in the active touchscreen region and the virtual bezel region is processed according to an instruction made by user of the electronic device (electronic device) for the gestural hardware on how a multi-touch input will be processed .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 12
. The display system according to claim 9 , wherein the display screen comprises an electronic device (electronic device) status display panel displaying at least one information item from a set of information items corresponding to a status of the electronic device , and wherein the user can toggle the electronic device status display panel between a visible mode and a hidden mode .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 13
. The electronic device (electronic device) according to claim 12 , wherein the active touchscreen region and the virtual bezel region function to process a third set of touch-based inputs from a user of the electronic device , the third set of touch-based inputs allowing the user to navigate the electronic device when the electronic device status display panel and the pre-defined set of touch-based soft buttons are in a hidden mode .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 14
. An electronic device (electronic device) comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 15
. The electronic device (electronic device) according to claim 14 further comprising non-transitory memory storing a gestural software application in communication with the virtual bezel display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 16
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : detecting a region of the touchscreen display in contact with fingers of a user holding the electronic device ;

registering the detected region as the virtual bezel region in a memory of the electronic device ;

receiving touch-based user input in the virtual bezel region ;

and interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display ;

offering the user to instruct the system what type of response to execute ;

and registering the user's response instruction in a memory of the electronic device for the detected region as personalized behavior for the virtual bezel region .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 17
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving an unintentional touch-based input from a user holding the electronic device in the touchscreen area , where the said user input area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing said polygonal area ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .

US9645663B2
CLAIM 18
. A method of defining a virtual bezel region of an electronic device (electronic device) having a touchscreen display , the method comprising : receiving a heat signature from a user's hand (threshold value) holding the electronic device utilizing device's thermal sensors , wherein the heat signature forms an area comprising vertices of a polygonal area on the touchscreen display ;

registering the polygonal area in a memory of the electronic device ;

detecting the frequency of accessing the polygonal area ;

receiving touch-based user input in the virtual bezel region ;

interpreting the received touch-based user input within the virtual bezel region as intentional user input intended to affect the display of content on the touchscreen display outside of the virtual bezel region ;

using the polygonal area registered in memory and its detected usage frequency to define a personalized holding pattern for the user of the electronic device ;

and registering a personalized holding pattern in a memory of the electronic device to define a virtual bezel region of said electronic device .
EP2375309A1
CLAIM 1
A method comprising : detecting a touch at a touch location on a touch-sensitive display ;
identifying a delay associated with the touch location ;
when a force value related to the touch meets a first threshold value (s hand) , providing a first tactile feedback after waiting the delay .
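
EP2375309A1 claim 1 ties a localized delay to the touch location and gates tactile feedback on a force threshold; the sketch below illustrates that sequence with invented delay values and region boundaries.

```python
# Hypothetical localized-delay feedback: each touch location maps to a delay,
# and feedback is issued only after that delay and only above the threshold.
import time

DELAY_BY_REGION_MS = {"edge": 120, "center": 0}     # assumed per-location delays
FIRST_THRESHOLD_N = 0.5

def region_for(x, y, width=1080, height=1920, margin=40):
    near_edge = x < margin or y < margin or x > width - margin or y > height - margin
    return "edge" if near_edge else "center"

def handle_touch(x, y, force_n, vibrate=lambda: print("tactile feedback")):
    if force_n < FIRST_THRESHOLD_N:
        return False
    time.sleep(DELAY_BY_REGION_MS[region_for(x, y)] / 1000.0)   # wait the localized delay
    vibrate()
    return True

if __name__ == "__main__":
    handle_touch(10, 500, 0.8)      # edge touch: feedback after ~120 ms
    handle_touch(540, 960, 0.8)     # center touch: immediate feedback
    handle_touch(540, 960, 0.2)     # below threshold: no feedback
```
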

EP2375309A1
CLAIM 13
A computer-readable medium having computer-readable code executable by at least one processor of a portable electronic device (electronic device) to perform the method of claim 1 .




US9645663B2

Filed: 2013-03-24     Issued: 2017-05-09

Electronic display with a virtual bezel

(Original Assignee) Belisso LLC     (Current Assignee) Onscreen Dynamics LLC

Sergey Mavrody
EP2375314A1

Filed: 2010-04-08     Issued: 2011-10-12

Touch-sensitive device and method of control

(Original Assignee) Research in Motion Ltd     (Current Assignee) BlackBerry Ltd

Jason Tyler Griffin, Perry Allan Faubert
US9645663B2
CLAIM 1
. A display system for an electronic device comprising : a touch-sensitive display screen configured to display content to a user of the electronic device ;

an active touchscreen region of the display screen having a touchscreen layer (first function) with a first mode of response to a first set (second function) of touch-based inputs from the user of the electronic device , the active touchscreen region configured to display a first portion of the content on the display screen ;

and a virtual bezel region along one or more edges of the display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to a second set of touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of the content on the display screen ;

and non-transitory memory storing a gestural software application in communication with the display screen , the gestural software application configured to produce the second mode of response in the virtual bezel region , wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region of the display screen .
EP2375314A1
CLAIM 9
An electronic device comprising : a touch-sensitive device ;
a processor configured to : identify a value of at least one parameter of the portable electronic device ;
modify a touch threshold based on the at least one parameter to yield a modified touch threshold ;
detect a touch on the touch-sensitive device ;
perform a first function (touchscreen layer) when the touch meets the modified touch threshold .

EP2375314A1
CLAIM 11
The electronic device of claim 9 , wherein the processor is further configured to modify a second touch threshold associated with a second function (first set) that is performed when the touch meets the second touch threshold .

US9645663B2
CLAIM 8
. The display system according to claim 1 , wherein an operating system status bar resides in the virtual bezel region , and wherein the user can toggle the status bar visibility (ambient parameter) using predefined set of gestures to toggle a full-screen mode .
EP2375314A1
CLAIM 4
The method of claim 1 , wherein the at least one parameter comprises at least one of a touch parameter , a function parameter , an ambient parameter (status bar visibility) , and a user parameter .

US9645663B2
CLAIM 14
. An electronic device comprising : a handheld interactive electronic device having a virtual bezel display screen , the virtual bezel display screen including : an active touchscreen region having a touchscreen layer (first function) with a first mode of response to touch-based inputs from a user of the electronic device , the active touchscreen region configured to display a first portion of the content on the virtual bezel display screen ;

and a virtual bezel region along one or more edges of the virtual bezel display screen and adjacent to the active touchscreen region , the virtual bezel region having a touchscreen layer with a second mode of response to touch-based inputs from a user of the electronic device , the virtual bezel region configured to display a second portion of content on the virtual bezel display screen ;

wherein the second mode of response is configured to selectively interpret touch-based inputs as intentional user input intended to affect the display of the first portion of the content on the active touchscreen region .
EP2375314A1
CLAIM 9
An electronic device comprising : a touch-sensitive device ;
a processor configured to : identify a value of at least one parameter of the portable electronic device ;
modify a touch threshold based on the at least one parameter to yield a modified touch threshold ;
detect a touch on the touch-sensitive device ;
perform a first function (touchscreen layer) when the touch meets the modified touch threshold .
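
EP2375314A1 claims 4, 9 and 11 adapt touch thresholds to a parameter and select between a first and second function; the sketch below shows that adaptation with assumed ambient/user parameters and scaling factors.

```python
# Hypothetical threshold adaptation: an ambient or user parameter scales the
# base thresholds, and the first or second function fires accordingly.
BASE_FIRST_N, BASE_SECOND_N = 0.3, 0.8

def modified_thresholds(ambient="normal", gloves=False):
    scale = 1.0
    if ambient == "cold":
        scale *= 1.2          # stiffer response in the cold (assumed policy)
    if gloves:
        scale *= 1.5
    return BASE_FIRST_N * scale, BASE_SECOND_N * scale

def dispatch(force_n, ambient="normal", gloves=False):
    first_t, second_t = modified_thresholds(ambient, gloves)
    if force_n >= second_t:
        return "second_function"
    if force_n >= first_t:
        return "first_function"
    return "ignored"

if __name__ == "__main__":
    print(dispatch(0.35))                                 # -> first_function
    print(dispatch(0.35, ambient="cold", gloves=True))    # -> ignored (thresholds raised)
    print(dispatch(0.9))                                  # -> second_function
```
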