Smartphone 'Handover' Screenlock Device
When a touch-screen smart device, particularly a smartphone, is handed from one user to another with the intent of sharing content shown on its display, the receiving user often touches the screen inadvertently. That touch triggers an unintended touch function, and the desired content is no longer shown on the display.
- When using a smartphone, a common desire is to show another person a photo
- When the phone is handed to the second user, either the first or second user often touches the touch screen accidentally
- The inadvertent touch often activates a function that removes the desired photo from the display
- The original user then has to retrieve the phone and tediously restore the photo before handing the phone over again
The core of the invention is a capability to freeze the touch screen for the handover of the device. In this manner there is no inadvertent touch event during the handover. Freezing of the touch screen can mean:
- Total lockdown of the touch screen so that no touch events are interpreted by the device OS
- Or, partial lockdown of the touchscreen, so that, for example, the zone around its outside perimeter does not register touch events, since this outer region is the most likely to experience inadvertent touches
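The partial-lockdown option can be sketched as a simple dead-zone test. This is a minimal illustration only: the function names, screen dimensions, and the margin width are assumptions, not a real platform API.

```python
def in_dead_zone(x, y, width, height, margin):
    """Return True if a touch at (x, y) falls in the perimeter band
    of `margin` pixels that is ignored during a partial freeze."""
    return (x < margin or y < margin or
            x >= width - margin or y >= height - margin)

def accept_touch(x, y, width, height, margin, partial_freeze):
    """Drop touches in the dead zone while a partial freeze is active."""
    if partial_freeze and in_dead_zone(x, y, width, height, margin):
        return False  # likely an accidental grip touch during handover
    return True
```

A real implementation would hook this test into the OS touch-event pipeline before events are dispatched to applications.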
There are two key ways in which freezing of the touchscreen can be implemented:
- Automatically, where sensors in the device are used to monitor and detect a handover of a device from a first to a second user
- Manually, where the first user implements an input that freezes the device for the handover function
- The freezing of the device can be for a fixed period, designed so that the handover can be effected without issues. This fixed period can be a system default or user-defined
- Or for the whole period of time whilst the second user has the device
- Or until either the first or second user implements a user input to unfreeze the device
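The three thaw options above can be sketched as a small state machine. The class below is an illustrative assumption, not platform code; the clock is injected as a parameter so the logic can be tested without real timers.

```python
class HandoverFreeze:
    """Freeze state with three ways to thaw: a fixed period elapses,
    an explicit unfreeze input arrives, or (with no fixed period)
    the freeze holds until unfrozen."""

    def __init__(self, fixed_period=None):
        self.fixed_period = fixed_period  # seconds, or None for indefinite
        self.frozen_at = None

    def freeze(self, now):
        """Enter handover mode at time `now` (seconds)."""
        self.frozen_at = now

    def unfreeze(self):
        """Explicit unfreeze input from either the first or second user."""
        self.frozen_at = None

    def is_frozen(self, now):
        if self.frozen_at is None:
            return False
        if (self.fixed_period is not None
                and now - self.frozen_at >= self.fixed_period):
            self.frozen_at = None  # fixed period elapsed; thaw automatically
            return False
        return True
```

For example, `HandoverFreeze(fixed_period=3)` thaws itself three seconds after `freeze()`, while `HandoverFreeze()` stays frozen until `unfreeze()` is called.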
Automatic detection of the handover events can include:
- Use of gyroscope sensors or other sensors that determine the spatial position of the device
- Use of camera sensors, front and back, for example to detect the position of the device with respect to the hand(s) and/or face(s) of the user(s)
- Use of any other sensors in the device and any combination thereof
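A toy version of sensor-based detection can be sketched over accelerometer magnitudes alone. The jolt-then-settle heuristic and all threshold values are assumptions for illustration; a real detector would fuse gyroscope, camera, and any other available sensors as listed above.

```python
def detect_handover(samples, spike=14.0, rest=10.5, rest_window=3):
    """Flag a handover when a jolt above `spike` (m/s^2) is followed
    by `rest_window` consecutive near-rest readings below `rest`,
    i.e. the device was passed and is now held steady again."""
    jolted = False
    calm = 0
    for magnitude in samples:
        if magnitude > spike:
            jolted, calm = True, 0      # jolt seen; start looking for calm
        elif jolted:
            calm = calm + 1 if magnitude < rest else 0
            if calm >= rest_window:
                return True             # settled after the jolt: handover
    return False
```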
Manual entering and/or leaving of "handover" mode can be via:
- A button that is pressed
- A gesture that is made, possibly pre-recorded, and is used solely for this function
- Voice command
- Any other input possible into the device
- The "freeze" function may be used at all times, or only for certain apps which are more likely to contain content shared between users, e.g. photo apps
- The user may be able to define the apps in which the freeze function is active


Eye Strain Prevention Device
Computer vision syndrome (CVS) is a temporary condition resulting from focusing the eyes on a computer display for protracted, uninterrupted periods of time. Symptoms of CVS include headaches, blurred vision, neck pain, redness in the eyes, fatigue, eye strain, dry eyes, irritated eyes, double vision, polyopia, and difficulty refocusing the eyes. These symptoms can be further aggravated by improper lighting conditions (e.g. glare or bright overhead lighting) or air moving past the eyes (e.g. overhead vents, direct air from a fan).
Asthenopic symptoms in the eye are responsible for much of the morbidity in CVS. Proper rest for the eye and its muscles is recommended to relieve the associated eye strain. Various catch-phrases have been used to spread awareness about resting the eyes while working on computers. A routinely recommended approach is to consciously blink the eyes every now and then (this helps replenish the tear film) and to look out the window to a distant object or to the sky; doing so rests the ciliary muscles. One of the catch-phrases is the "20-20-20 rule": every 20 minutes, focus the eyes on an object 20 feet (6 meters) away for 20 seconds. This gives a convenient distance and time frame for a person to follow the advice of optometrists and ophthalmologists. Otherwise, the patient is advised to close his/her eyes (which has a similar effect) for 20 seconds, at least every half hour.
The problem is that the 20-20-20 rule is voluntary, and many people do not follow it at all, or only intermittently. Additionally, companies may have policies around this, but these are hard to police, and companies are potentially liable for future damages if employees develop CVS. What is needed is an enforced solution to the 20-20-20 rule in order to satisfy health and safety requirements.
The core of this invention is the inclusion of software on any device where a user might get CVS. The software is activated on the device for the required period, say 20 seconds once every 20 minutes, and its function is to force the user to focus their eyes ca. 20 feet away for that period.

In one embodiment this is enabled by displaying an image or other information on the screen (including, if required, the normal information that would have been displayed) in a manner that requires the user to focus on it for a minimum period of time in order to continue using the device; this image or other information is structured such that it forces the user to focus their eyes as if it were 20 feet away. An alternative mode uses the forward-facing camera on the device to monitor the user's eyes during a downtime period, such that the device is not usable unless the user has diverted their eyes from it for the appropriate period of time.

The software could be incorporated on the device as an app, in the OS, in the firmware, or even as a cloud-based solution. The user or a corporate IT manager would configure it, with the key parameters being the time between events, the duration of events, and the type of eye-focusing practice required. This could be a requirement for the user not to look at the screen for a period (monitored with the camera, for example), or the inclusion of a defocused image or a defocused version of the information that would normally have been displayed on the screen.
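The enforced work/rest cycle described above can be sketched as a tiny scheduler. The class name, parameter names, and the modulo-based schedule are illustrative assumptions; a real version would hook the OS input or compositor layer to actually lock the screen.

```python
class RestEnforcer:
    """Enforced 20-20-20 cycle: after `work` seconds of use, the
    device enters rest mode for `rest` seconds. Elapsed time is
    passed in explicitly so the logic is testable without timers."""

    def __init__(self, work=20 * 60, rest=20):
        self.work = work            # seconds of normal use per cycle
        self.rest = rest            # seconds of enforced rest per cycle
        self.cycle = work + rest

    def in_rest(self, elapsed):
        """True while the user must look away or view the defocused image."""
        return elapsed % self.cycle >= self.work
```

With the defaults, `in_rest` is False for the first 1200 seconds of each 1220-second cycle and True for the final 20 seconds, after which the next work period begins.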
A defocused image can be constructed in a number of ways, including:
(A) magic eye images. Magic Eye is a series of books published by N.E. Thing Enterprises (renamed in 1996 to Magic Eye Inc.). The books feature autostereograms (precisely, random dot autostereograms), which allow some people to see 3D images by focusing on 2D patterns. The viewer must diverge his or her eyes in order to see a hidden three-dimensional image within the pattern. "Magic Eye" has become something of a genericized trademark, often used to refer to autostereograms of any origin. The autostereogram predates the Magic Eye series by several years. Christopher Tyler created the first black-and-white autostereograms in 1979 with the assistance of computer programmer Maureen Clarke.
(B) An alternative to magic eye images is to use displays which have a 3-D operating mode in addition to the usual 2-D operating mode. These are now becoming common.
Where we have referred above to an image, we mean any information shown on a display, e.g. a picture, text, video, or other content. This can include the OS and application data that would normally have been displayed on the screen, but now in a defocused mode.
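Method (A) can be illustrated with a minimal random-dot autostereogram generator. This is a sketch only: the fixed pattern width, single-level depth map, and constant shift are simplifying assumptions; a real implementation would compute per-pixel eye separation.

```python
import random

def autostereogram(depth, pattern_width=8, shift=2):
    """Build a random-dot autostereogram from a 2-D `depth` grid of
    0 (background) / 1 (raised). Each row starts with a random strip
    of `pattern_width` dots; raised pixels shorten the horizontal
    repeat distance by `shift`, which diverged eyes fuse into
    apparent depth."""
    rows = []
    for depth_row in depth:
        row = [random.randint(0, 1) for _ in range(pattern_width)]
        for x in range(pattern_width, len(depth_row)):
            separation = pattern_width - (shift if depth_row[x] else 0)
            row.append(row[x - separation])  # copy dot from one repeat back
        rows.append(row)
    return rows
```

Rendered at screen scale (with pixels instead of 0/1 dots), the raised region appears to float in front of the background when the viewer relaxes their focus beyond the screen, which is exactly the far-focus behaviour the CVS software wants to induce.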
The user would satisfy the requirements of the CVS software either by looking away from the screen for the desired time period, or by focusing on the defocused image for the required period of time.


New Automotive Reflector
The invention is composed of two parts:
1. The first part is a modulated retro-reflector structure, where current or charge is used to modulate, say, a MEMS retro-reflector structure. These devices are known in the literature, having been proposed solely for optical communications, where they need to operate at very high frequencies. See, for example, http://en.wikipedia.org/wiki/Modulating_retro-reflector
2. The second part is a device that drives the modulating feature of the modulating retro-reflector. Typically this will be a MEMS structure that requires current or power, modulated by a semiconductor device built into the structure or on a separate board. The power source can be either:
a. a low-power long-life battery and a photodiode that is used to detect (1) nighttime/dark conditions and (2) incoming incident light, such that the modulating feature is only used when necessary, to conserve power, or,
b. more advantageously, a built-in photovoltaic solar cell that uses some of the incident light to drive the MEMS switching structure to enable modulation. This device can also include a rechargeable battery to store power gathered during daylight periods.
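The power-gating decision in option (a) can be sketched as a single predicate. The parameter names and threshold values are illustrative assumptions; real thresholds would be calibrated to the photodiode and battery chemistry used.

```python
def modulate_enabled(is_dark, incident_light, battery_level,
                     light_threshold=0.2, battery_floor=0.05):
    """Run the MEMS modulator only when the photodiode reports dark
    conditions AND headlight-level incident light is present, and the
    battery (levels normalised to 0..1) is not nearly exhausted."""
    return (is_dark
            and incident_light >= light_threshold
            and battery_level > battery_floor)
```

Gating on all three conditions means the modulator draws power only when a vehicle is actually illuminating the reflector at night, which is what lets a small battery last for years.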
This invention is made by a number of means:
1. The Modulating MEMS structure is made in a typical MEMS fab
2. The control electronics are made in various semiconductor fabs and then assembled onto printed circuit boards, as per typical electronic devices
3. The whole reflector device is then packaged in a plastic housing, typically in China, where this is done manually
The invention is used very simply: it replaces current static light reflectors.

