Posts Tagged ‘embedded’

Nano strengthens barriers to counterfeiting

By providing non-reproducible technological features, nanotechnology-based developments are expected to offer a significant step forward in preventing the illicit copying of intellectual property and products. Ultimately, the implementation of these novel techniques should considerably reduce the tax revenue lost to counterfeiting and improve citizens’ safety and quality of life.

Holograms, tamper-evident closures, tags and markings, and RFID labels are the most widely known anti-counterfeiting technologies. The key limitation of these methods is that they can be copied. Innovations that exploit the intrinsic nature of nanomaterials to give items complex and unique ‘fingerprints’ result both in new approaches and in improvements to existing techniques.

Holography ‐ easily identifiable holograms, for example those showing the manufacturer’s logo, are primarily used as first-level identification devices. Two-dimensional nanoscale gratings, photopolymers and luminescent nanoparticles can be utilized to provide an additional level of security for the holograms.

Laser surface authentication ‐ a laser is used to examine the surface roughness of an object. The complexity and uniqueness of the resulting surface roughness code is comparable to that of iris scans and fingerprints. The advantage of the technique is that surface roughness at the nanoscale cannot be replicated, so it offers products a much higher level of security than holograms and watermarks.

Radio frequency identification (RFID) ‐ a form of automatic identification and data capture technology in which data stored on a tag is transferred via a radio frequency link. An RFID reader is used to extract this data from tags. New developments exploit nanoscale variations, naturally produced during the manufacture of RFID integrated circuits, that are unique to each individual chip and can be verified during data transfer. This is known as a Physically Unclonable Function (PUF).

Nano barcodes ‐ three-dimensional polymer patterns on the order of tens of nanometres can be made on silicon substrates to provide a 3D nanoscale data encryption key, similar to a barcode. The advantages over conventional barcodes and markings are the difficulty of detecting their presence (covert marking) and of duplicating them. These can be applied to banknotes, security papers, art, jewellery and gemstones.

SERS and quantum dot tags – metal nanoparticles produce unique electromagnetic spectra (known as surface-enhanced Raman scattering), while certain semiconductor nanoparticles (known as quantum dots) fluoresce differently depending on their size and chemical composition. Both can be exploited as identification tools. They are difficult to reproduce because of the effectively unlimited number of combinations, and they offer a covert security feature, non-toxicity and multifunctionality. These nanoscale tags can be applied in inks, adhesives, laminates, paper, packaging, textiles, glass and other materials.

Nano composite tags – consist of a materials-based pattern (with magnetic and/or optical features) that forms part of a label, tag or embedded portion of an item. The nanometre-sized magnetic and optical features are generated randomly during manufacturing, constituting a unique ‘fingerprint’ that is read and stored in a central database. The result is a secure identity for an individual item that is prohibitively expensive and difficult to copy. This technology can be applied in the pharmaceutical, spare parts, fashion, and food and beverage industries. Incorporating encapsulated and functionalized (e.g. thermochromic) nanoparticles in labels is another promising solution based on the use of nanocomposites.
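
As a rough illustration of the database-backed verification this implies, the sketch below enrolls an item’s randomly generated feature ‘fingerprint’ and later checks a fresh reading against the stored record. The class name, the in-memory map standing in for the central database, and the similarity threshold are all hypothetical assumptions, not part of any specific vendor’s system.

import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: enroll a nanocomposite tag's measured feature vector,
// then verify a later reading against the stored "fingerprint".
public class TagRegistry {
    // Stand-in for the central database mentioned above.
    private final Map<String, double[]> database = new HashMap<>();

    // Store the fingerprint measured at manufacture time under the item's ID.
    public void enroll(String itemId, double[] measuredFeatures) {
        database.put(itemId, measuredFeatures.clone());
    }

    // Compare a fresh reading with the enrolled fingerprint; readings are noisy,
    // so small deviations are accepted (threshold chosen arbitrarily here).
    public boolean verify(String itemId, double[] freshReading, double threshold) {
        double[] enrolled = database.get(itemId);
        if (enrolled == null || enrolled.length != freshReading.length) {
            return false;
        }
        double sumSq = 0.0;
        for (int i = 0; i < enrolled.length; i++) {
            double d = enrolled[i] - freshReading[i];
            sumSq += d * d;
        }
        return Math.sqrt(sumSq) <= threshold;
    }
}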


Overcoming Barriers to Early Detection with Pervasive Computing

Embedded assessment leverages the capabilities of pervasive computing to advance early detection of health conditions. In this approach, technologies embedded in the home setting are used to establish personalized baselines against which later indices of health status can be compared. Our ethnographic and concept feedback studies suggest that adoption of such health technologies among end users will be increased if monitoring is woven into preventive and compensatory health applications, such that the integrated system provides value beyond assessment. We review health technology advances in the three areas of monitoring, compensation, and prevention. We then define embedded assessment in terms of these three components. The validation of pervasive computing systems for early detection involves unique challenges due to conflicts between the exploratory nature of these systems and the validation criteria of medical research audiences. We discuss an approach for demonstrating value that incorporates ethnographic observation and new ubiquitous computing tools for behavioral observation in naturalistic settings such as the home.

Leveraging synergies in these three areas holds promise for advancing detection of disease states. We believe this highly integrated approach will greatly increase adoption of home health technologies among end users and ease the transition of embedded health assessment prototypes from computing laboratories into medical research and practice. We derive our observations from a series of exploratory and qualitative studies on ubiquitous computing for health and well-being. These studies highlighted barriers to early detection in the clinical setting, concerns about home assessment technologies among end users, and the values of target user groups related to prevention and detection. Observations from the studies are used to identify challenges that must be overcome by pervasive computing developers if ubiquitous computing systems are to gain wide acceptance for early detection of health conditions.

The motivation driving research on pervasive home monitoring is that clinical diagnostic practices frequently fail to detect health problems in their early stages. Often, clinical testing is first conducted after the onset of a health problem, when there is no data about an individual’s previous level of functioning. Subsequent clinical assessments are conducted periodically, often with no data other than self-report about functioning between clinical visits. Self-report data on mundane or repetitive health-related behaviors has repeatedly been shown to be unreliable. Clinical diagnostics are also limited in ecological validity, not accounting for functioning in the home and other daily environments. Another barrier to early detection is that age-based norms used to detect impairment may fail to capture significant decline among people whose premorbid functioning was far above average. Cultural differences have also been repeatedly shown to influence performance on standardized tests. Although early detection can cut costs in the long term, most practitioners are more accustomed to dealing with severe, late-stage health issues than with subclinical patterns that may or may not be markers for more serious problems. In our participatory design interviews, clinicians voiced concerns about false positives causing unwarranted patient concern and additional demands on their time.

Compounding the clinical barriers to early detection listed above are psychological and behavioral patterns among individuals contending with the possibility of illness. Our interviews highlighted denial, perceptual biases regarding the variability of health states, over-confidence in recall and insight, a preference for preventive and compensatory directives over pure assessment results, and a disinclination towards time-consuming self-monitoring as barriers to early detection. Our ethnographic studies of households coping with cognitive decline revealed a tension between a desire for forecasting of what illness might lie ahead and a counter-current of denial. Almost all caregivers and patients wished that they had received an earlier diagnosis to guide treatment and lifestyle choices, but they also acknowledged that they had overlooked blatant warning signs until the occurrence of a catastrophic incident (e.g. a car accident). This lag between awareness and actual decline caused them to miss the critical window for initiating treatments and planning that could have had a major impact on independence and quality of life. Ethnography and concept feedback participants attributed this denial in part to a fear of being diagnosed with a disease for which there is no cure. They also worried about the effect of this data on insurers and other outside parties. Participants in the three cohorts included in our studies (boomers, healthy older adults, and older adults coping with illness themselves or in a spouse) were much more interested in, and less conflicted about, preventive and compensatory directives than pure assessment.

Perceptual biases also appear to impede traditional assessment and self-monitoring. Ethnography participants reported having consistently overestimated functioning before a catastrophic event and appeared, during the interview, to consistently underestimate functioning following detection of cognitive impairment. Additionally, we observed probable over-confidence among healthy adults in their ability to recall behaviors and analyze their relationship to both environmental factors and well-being. This confidence in recall and insight seemed exaggerated given findings that recall of frequent events is generally poor. As a result of these health perceptions, many of those interviewed felt that the time and discipline required for journaling (e.g. of eating, sleeping, mood, etc.) outweighed the benefits. Additionally, they expressed wariness of confronting or being reprimanded about what is already obvious to them. They would prefer to lead investigations and develop strategies for improving their lives. Pervasive computing systems may enable this type of integrated, contextualized inquiry if they can also overcome the clinical and individual barriers that might otherwise impede adoption of the new technologies.


Watermarking and Fingerprinting in the Transcoding Workflow

When you add the enormous accumulation of video content that resides in the archives of media organizations to all of the new content that is constantly being produced, you realize that sheer volume poses a significant challenge to utilizing both watermarking and fingerprinting technologies. At a minimum, implementing an anti-piracy solution around watermarking requires a set of core capabilities.

For watermarking, consider whether to handle detection in-house or whether to employ an outside service. The choice depends on how the watermark will be used. For example, to trace a single video back to its source should it be leaked, detection could easily be handled in-house. If you would rather let someone else hunt down leaks, perhaps because they could come from a large number of would-be pirates, consider an external detection service.

Fingerprinting requires the following capabilities:

  1. Method to generate fingerprints
  2. Database to store metadata relating fingerprints to originals
  3. Third-party service (or multiple services) that tracks fingerprints and provides access control information

One of the problems with current watermarking and fingerprinting technologies is that they accept only a limited number of input formats. And, in the case of watermarking, they generate only a limited set of output formats. Plugging a particular watermarking or fingerprinting technology into Carbon Coder allows you to handle any media type. Carbon Coder can be run as a stand-alone transcoding engine or as part of a larger transcoding farm for higher-volume workflows.

Transcoding a media file from one video format to another involves a number of steps: demultiplexing the source container, decoding the audio and video streams, transforming the uncompressed frames (scaling, filtering and so on), re-encoding, and multiplexing into the output container.

For the fastest execution, all of this occurs on the fly, in memory. In watermarking, the watermarking filter is plugged into the Carbon Coder pipeline and is applied during the transform stage. The result is a media file in whatever format is required for distribution.
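
The Carbon Coder plug-in interfaces are proprietary, so the following is only a generic sketch of the idea: a transform-stage filter that stamps an identifier onto each uncompressed frame as it passes through an in-memory pipeline. All type and method names here are invented for illustration and do not correspond to the actual product API.

// Generic illustration of a transform-stage watermarking filter; none of these
// types correspond to the real Carbon Coder plug-in interfaces.
interface FrameFilter {
    Frame apply(Frame frame);
}

class Frame {
    final byte[] pixels;   // uncompressed video frame
    Frame(byte[] pixels) { this.pixels = pixels; }
}

class WatermarkFilter implements FrameFilter {
    private final byte[] payload; // e.g. an ID tracing the copy back to its source

    WatermarkFilter(byte[] payload) { this.payload = payload; }

    @Override
    public Frame apply(Frame frame) {
        byte[] out = frame.pixels.clone();
        // Toy embedding: fold the payload into the least significant bits.
        for (int i = 0; i < out.length; i++) {
            int bit = (payload[(i / 8) % payload.length] >> (i % 8)) & 1;
            out[i] = (byte) ((out[i] & ~1) | bit);
        }
        return new Frame(out);
    }
}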

In fingerprinting, the fingerprint technology is embedded into Carbon Coder as an exporter. It analyzes the uncompressed audio and video frames and generates a fingerprint file, which can be used to recognize the original media in whatever format it is found. (There are no transform, encode or multiplex steps, because the output is only a fingerprint, not a media file.)
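
By contrast, an exporter-style integration only reads the decoded frames and writes a fingerprint record instead of re-encoded media. Again, this is a hypothetical sketch rather than the vendor API, and the plain hash used below only marks where a real perceptual fingerprint (one that survives re-encoding) would sit.

import java.io.FileOutputStream;
import java.io.IOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical exporter: consumes decoded frames and emits a fingerprint file
// instead of a media file.
class FingerprintExporter {
    private final MessageDigest digest;

    FingerprintExporter() throws NoSuchAlgorithmException {
        digest = MessageDigest.getInstance("SHA-256");
    }

    // Called once per uncompressed frame delivered by the transcoding pipeline.
    void consume(byte[] uncompressedFrame) {
        digest.update(uncompressedFrame);
    }

    // Write the accumulated fingerprint to disk when the clip has been analyzed.
    void finish(String outputPath) throws IOException {
        try (FileOutputStream out = new FileOutputStream(outputPath)) {
            out.write(digest.digest());
        }
    }
}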


Desirable Quantum Key Distribution Attributes

Broadly stated, QKD (Quantum Key Distribution) offers a technique for two distinct devices to agree upon a shared random sequence of bits, with a very low probability that other devices (eavesdroppers) will be able to make successful inferences as to those bits’ values. In practice, such sequences are then used as secret keys for encoding and decoding messages between the two devices. Viewed in this light, QKD is quite clearly a key distribution technique, and one can rate QKD’s strengths against a number of important goals for key distribution, as summarized in the following paragraphs.

Confidentiality of Keys : Confidentiality is the main reason for interest in QKD. Public key systems suffer from an ongoing uncertainty as to whether decryption without the private key is truly mathematically intractable. Thus key agreement primitives widely used in today’s Internet security architecture, e.g., Diffie-Hellman, may perhaps be broken at some point in the future. This would not only hinder future ability to communicate but could also reveal past traffic. Classic secret key systems have suffered from different problems, namely insider threats and the logistical burden of distributing keying material. Assuming that QKD techniques are properly embedded into an overall secure system, they can provide automatic distribution of keys with security that may be superior to that of their competitors.

Authentication : QKD does not in itself provide authentication. Current strategies for authentication in QKD systems include prepositioning of secret keys at pairs of devices, to be used in hash-based authentication schemes, or hybrid QKD-public key techniques. Neither approach is entirely appealing. Prepositioned secret keys require some means of distributing these keys before QKD itself begins, e.g., by human courier, which may be costly and logistically challenging. Furthermore, this approach appears open to denial-of-service attacks in which an adversary forces a QKD system to exhaust its stockpile of key material, at which point it can no longer perform authentication. On the other hand, hybrid QKD-public key schemes inherit the possible vulnerabilities of public key systems to cracking via quantum computers or unexpected advances in mathematics.
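
As an illustration of the first strategy, the fragment below authenticates a protocol message with a keyed hash derived from a prepositioned shared secret. The javax.crypto classes used are standard Java; the HMAC construction is only a simple stand-in for the universal-hash authentication schemes typically used in real QKD systems, and the key and message contents are placeholders.

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

// Hash-based authentication with a prepositioned shared secret:
// both endpoints hold the same key and verify each protocol message's tag.
public class QkdMessageAuth {
    public static byte[] tag(byte[] prepositionedKey, byte[] message) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(prepositionedKey, "HmacSHA256"));
        return mac.doFinal(message);
    }

    public static boolean verify(byte[] prepositionedKey, byte[] message, byte[] receivedTag) throws Exception {
        byte[] expected = tag(prepositionedKey, message);
        // Constant-time comparison to avoid leaking tag information.
        return java.security.MessageDigest.isEqual(expected, receivedTag);
    }
}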

Sufficiently Rapid Key Delivery : Key distribution systems must deliver keys fast enough so that encryption devices do not exhaust their supply of key bits. This is a race between the rate at which keying material is put into place and the rate at which it is consumed for encryption or decryption activities. Today’s QKD systems achieve on the order of 1,000 bits/second of keying material in realistic settings, and often run at much lower rates. This is unacceptably low if the keys are used in certain ways, e.g., as one-time pads for high-speed traffic flows. However, it may well be acceptable if the keying material is used as input to less secure (but often secure enough) algorithms such as the Advanced Encryption Standard. Nonetheless, it is both desirable and possible to greatly improve upon the rates provided by today’s QKD technology.
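
The arithmetic behind that trade-off is easy to see. The snippet below compares the 1,000 bits/second figure against one-time-pad demand and against periodic AES-256 rekeying; the 1 Gb/s traffic rate and the one-second rekey interval are illustrative assumptions, not figures from the text.

// Back-of-the-envelope comparison of key supply vs. key demand.
public class KeyBudget {
    public static void main(String[] args) {
        double qkdRateBps = 1_000.0;              // keying material from QKD, bits/second
        double trafficRateBps = 1_000_000_000.0;  // assumed 1 Gb/s traffic flow

        // One-time pad: key demand equals traffic volume, so supply falls far short.
        System.out.printf("OTP shortfall: %.0f bits/s%n", trafficRateBps - qkdRateBps);

        // AES-256: one 256-bit key per rekey interval, easily covered by QKD output.
        double rekeyIntervalSeconds = 1.0;        // assumed: fresh key every second
        double aesDemandBps = 256.0 / rekeyIntervalSeconds;
        System.out.printf("AES-256 demand: %.0f bits/s (supply %.0f bits/s)%n",
                aesDemandBps, qkdRateBps);
    }
}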

Robustness : This has not traditionally been taken into account by the QKD community. However, since keying material is essential for secure communications, it is extremely important that the flow of keying material not be disrupted, whether by accident or by the deliberate acts of an adversary (i.e., by denial of service). Here QKD has provided a highly fragile service to date, since QKD techniques have typically been employed along a single point-to-point link. If that link were disrupted, whether by active eavesdropping or indeed by a fiber cut, all flow of keying material would cease. In our view a meshed QKD network is inherently far more robust than any single point-to-point link, since it offers multiple paths for key distribution.

Distance- and Location-Independence : In the ideal world, any entity can agree upon keying material with any other (authorized) entity in the world. Rather remarkably, the Internet’s security architecture does offer this feature – any computer on the Internet can form a security association with any other, agreeing upon keys through the Internet IPsec protocols. This feature is notably lacking in QKD, which requires the two entities to have a direct and unencumbered path for photons between them, and which can operate over only a few tens of kilometers of fiber.

Resistance to Traffic Analysis : Adversaries may be able to perform useful traffic analysis on a key distribution system; e.g., a heavy flow of keying material between two points might reveal that a large volume of confidential information flows, or will flow, between them. It may thus be desirable to impede such analysis. Here QKD has in general taken a rather weak approach, since most setups have assumed dedicated, point-to-point QKD links between communicating entities, which clearly lays out the underlying key distribution relationships.


SQLJ

Along with host-language embedded SQL applications, there are also embedded Java applications, better known as SQLJ programs. SQLJ is a method for accessing DB2 from a Java application that supports static execution. Again, the benefits of static execution are reduced resource consumption, improved diagnostics, improved security, and more repeatable SQL performance due to static access paths and plans. Everything needed to retrieve the data is already in the package bound at bind time.

SQLJ provides the performance benefits of static query execution by embedding SQL statements in Java applications. SQLJ still utilizes the JDBC driver to access the data source and is a layer above JDBC. The SQLJ translator is used to process SQLJ source files with the extension .sqlj. It translates .sqlj source files into .java files and produces an SQLJ serialized profile in the form of a .ser file. The serialized profile contains all the SQL statements from the original SQLJ source file. The resulting translated .java file contains calls to the SQLJ run-time libraries in place of the SQL statements. To bind the application statically to a DB2 database, you use the DB2 profile customizer tool, db2sqljcustomize. The db2sqljcustomize command connects to the target database and binds a package there using the serialized profile. The package bound in the target database by db2sqljcustomize contains sections corresponding to each SQL query in the serialized profile.

Commands associated with SQLJ:

1. sqlj:

sqlj is the translator that takes an embedded SQLJ program and creates a .ser file used for binding and a .java file that is then compiled into byte code, just as typical Java programs are.

2. db2sqljcustomize:

This command takes the .ser file produced by the sqlj step, connects to the database against which the application will be run, and binds four packages for this application, one for each isolation level.

3. db2sqljbind:

This command can be used to rebind the application against other databases; for example, when moving the application from the test database to the production database.

The following packages need to be imported for SQLJ:

import java.sql.*;
import sqlj.runtime.*;
import sqlj.runtime.ref.*;
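
A minimal SQLJ source file might look like the following. It assumes the EMPLOYEE table of the DB2 SAMPLE database, a connection context already set up elsewhere in the application, and placeholder class and iterator names; running the file through the sqlj translator produces the .java and .ser files, which db2sqljcustomize then binds as described above.

import java.sql.*;
import sqlj.runtime.*;
import sqlj.runtime.ref.*;

// Named iterator: accessor names must match the column names (or aliases)
// returned by the query.
#sql iterator EmpIter (String lastname, double salary);

public class EmpReport {
    public static void main(String[] args) throws SQLException {
        // Assumes a JDBC Connection has been obtained and registered as the
        // default connection context elsewhere in the application.
        EmpIter rows;
        #sql rows = { SELECT LASTNAME AS lastname, SALARY AS salary
                      FROM EMPLOYEE WHERE WORKDEPT = 'A00' };
        while (rows.next()) {
            System.out.println(rows.lastname() + " earns " + rows.salary());
        }
        rows.close();
    }
}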
