Is Anti-Smoking Based on Science or Morality?

Tobacco smoking is currently seen by many as the scourge of society, an act of those wanting to slowly kill themselves. It is a common perception that this idea rests solely on scientific evidence accumulated over the past 60 years. Yet the truth is that smoking has always attracted the wrath of purists. In the past, ‘public health’ measures were enacted not because of scientific evidence but out of a sense of morality – alcohol was condemned and labelled a sinful activity because of moral sensitivity, and the same was true of tobacco. So the question is: is the attack on smoking today once again born of ethical reasoning, or of scientific rigour?

 

When Christopher Columbus reached Cuba in 1492 with Rodrigo de Jerez and Luis de Torres, his two men experimented with smoking tobacco through a pipe; Columbus himself not only refrained but spoke against it, saying Rodrigo and Luis had sunk to the level of “savages” by smoking. When they packed tobacco onto their boat and returned to Europe, there was an immediate divide between those who loved it and those who hated it, even inspiring King James I to write ‘A Counter Blaste to Tobacco’.

 

In the 1600s, parts of the world actively punished smokers. First-time ‘offenders’ in Russia were whipped and had their noses slit before being sent to Siberia; those caught a second time were put to death. Sultan Murad IV of Turkey had smokers castrated, and as many as 18 a day were executed. In China, smokers were decapitated.

 

Such punishments did not spread to the UK or USA, but other restrictions existed. In 1900, Tennessee, North Dakota, Washington and Iowa banned the sale of cigarettes, and by the following year 43 American states were strongly opposed to smoking. In 1904 a woman in New York was sentenced to prison for smoking in the presence of her children, and a policeman arrested a woman for smoking in her car, telling her, “You can’t do that on Fifth Avenue.” In 1907, some businesses refused to employ smokers.

 

By 1917 anti-smoking feeling was still strong, and the primary focus was on using children to discourage smoking. Doctors would tell smokers they would suffer from blindness, tuberculosis or “tobacco heart”. Like today, insurance companies and surgeons would enquire whether their customers or patients smoked. The August 1917 issue of the magazine ‘The Instructor’ was labelled “the annual anti-tobacco issue” and featured cartoons demonising smoking, as well as articles stating: “One puff does not destroy the brain or heart; but it leaves a stain…until finally the brain loses its normality, and the victim is taken to the hospital for the insane or laid in a grave. One puff did not paralyse the young man in the wheel chair; but the many puffs that came as a result of the first puff, did.”

 

That wave of anti-smoking sentiment lasted until 1927, in America at least, yet none of today’s science had been collected by then – it was all based on moral principle. Germany was producing its own anti-smoking campaign around that time, with the famous “The German woman does not smoke” posters, as well as public smoking bans. The 1950s saw the creation of the now-famous studies by Sir Richard Doll linking smoking to lung cancer, alongside other researchers such as Ernst Wynder, described as a fanatical anti-smoker. Focusing on smoking as a sole factor, at a time when it had yet to be implicated in disease, was perhaps a hint that the researchers wanted to find an association, as so many scientists strived to do at that moment in history. In light of the findings, it was mentioned that 10% of smokers may contract lung cancer. That figure has largely been dropped from the discussion in more recent decades, although it still holds true.

 

Things progressed again in the 1970s with what has become known as the Godber Blueprint. Sir George Godber was a WHO representative who spoke openly of the “elimination of cigarette smoking”, with comments such as: “Need there really be any difficulty about prohibiting smoking in more public places? The nicotine addicts would be petulant for a while, but why should we accord them any right to make the innocent suffer?” Godber laid out a plan to achieve that goal, much of which has since come into effect, such as “major health agencies [should] join forces to create and produce anti-smoking material for mass media”. He also said the following should happen: the elimination of cigarette smoking; the inclusion of quit-smoking assistance in health insurance; the creation of “a social environment in which smoking is unacceptable”; tobacco prices raised enough to discourage sales; a ban on all forms of tobacco advertising; and committees of sophisticated politicians in every country to help pursue the stated goals. Almost 20 years before the EPA’s report that second-hand smoke poses a threat to non-smokers, Godber was drawing up plans to convince people of that very thing.

 

With regard to second-hand smoke and the question of ‘morality or science?’, about 85% of the studies on second-hand smoke and lung cancer have failed to find a significant relationship between the two. Of the remaining 15%, most indicated a positive statistical relationship while some actually indicated a negative, or protective, one. These studies were all statistical epidemiology, not findings of cause and effect, and the average relative risk among the 15% that did find an associated risk is only 1.17 – a figure generally regarded as too weak to be meaningful. Of the 85%, most are kept out of sight, the most famous probably being the study conducted by the WHO, the largest ever performed on second-hand smoke, which the organisation kept quiet because its findings showed no ill effect of second-hand smoke. Enstrom and Kabat also conducted a large, 39-year study into passive smoking. It was commissioned by the American Cancer Society and funded by the Tobacco-Related Disease Research Program. When the preliminary data arrived and showed no harm from passive smoking, the funding was pulled. This left the researchers with no choice but to accept funding from the tobacco industry-funded Center for Indoor Air Research, although the results remained unchanged from those obtained while the TRDRP funded the work.

 

Recently there have been proposals for, and in some cases the enactment of, outdoor smoking bans – Milton Keynes came close to one and New York has established one – despite no evidence that they benefit the health of non-smokers. Indeed, anti-smokers today openly talk of keeping smokers out of sight and of “denormalising smoking”. The difference today is that there are now many vested interests with financial gains to be had from the prohibition of tobacco, but the similarity with the past remains: much of the hysteria is based on a moral disagreement with the act. If the science is lacking, as it is on passive smoking, yet bans are put in place, studies showing ‘undesirable’ results are hidden, and those who disagree with the literature are accused of being in the pocket of Big Tobacco, then scientific credibility is thrown into disrepute, and we are left wondering whether those behind the numbers harbour feelings similar to those of Columbus himself.

 

 

The OxyContin Cha Cha

This is a guest post by Andrew Phillips

The dangers of OxyContin were known in the late 1990s, and between 1999 and 2003 there was a four- to five-fold increase in deaths in which OxyContin was detected in the bloodstream. By now many people are aware that the government will be taking OxyContin out of pharmacies across the country, and Ontario will be delisting the painkiller as well. However, down in the Maritimes no plan is in place to fund either OxyContin or its replacement, OxyNEO, and Saskatchewan is not planning on funding OxyContin either.

Discussions to delist the drug started around the time Purdue Pharma sent notices that the company was replacing OxyContin with OxyNEO, which was approved by Health Canada on Aug. 22, 2011. But OxyNEO is essentially the same thing as OxyContin; in fact, the only difference appears to be that it will be harder to crush and snort – same stuff, different name. So why is it being taken off the market now, when its effects have been known for so long? Easy – OxyContin is about to go off patent in 2013.

As for Health Canada, I suggest you read that article, especially the section about conflicts of interest and the fast-tracking of drug approvals, and question the approval of OxyNEO. But perhaps the worst aspect of this partnership is Health Canada’s failure to enforce the rules against direct-to-consumer advertising (DTCA) of prescription drugs in Canada – ads which use fear to drive patients into doctors’ offices to demand the most expensive new drugs, drugs that may or may not help them.

To understand the inherent danger of DTCA, this article goes in depth into how it works. Another interesting development is that the Supreme Court recently ruled that ISPs aren’t bound by the Broadcasting Act, with one of the country’s biggest ISPs – Bell Media – now owning CTV, CTV2, and many radio and specialty channels. Will they use this as an end-run to air even more DTCA drug ads in Canada? Ads for Champix and Gardasil are showing up on Canadian TV now, and we can likely expect to see more DTCA in the near future.

An interesting sidebar to this is that Health Canada is in charge of the Canada Consumer Product Safety Act which, considering what the department is doing – or not doing – now, makes you wonder what the real reason for it was in the first place. You can read about it here and here. Quite frankly, it appears to be another euphemistically named law, much like Vic Toews’ Protecting Children from Internet Predators Act, which, it now appears, they didn’t bother to read too carefully and which actually curtails civil liberties and property rights. So while they’re working on the “Surveillance Bill”, they should pull that one out as well.

Of all the news articles in Canada, not one has mentioned the off-patent angle. Not one has mentioned just how curious it is that while one drug is being pulled early, a replacement is already available. The farcical assertion that drug addicts will be stopped by a pill that is a little tougher to crack is negated by new extraction techniques already being discussed by addicts, and this little nugget goes a long way to explaining the timing: “…the company is positioning itself to avoid having its product deemed interchangeable with lower cost alternatives that will be brought to market once OxyContin® loses its exclusive patent”. Ultimately, that new extraction technique might just mean buying a bigger hammer. They’re drug addicts; they’re not stupid.

Sample Shots From Nokia 808 PureView – With Zoom

Yesterday Nokia announced the 808, featuring a 41-megapixel camera with PureView. Today we have sample photos of what the phone can do, which, aside from producing stunning pictures, includes the ability to zoom in to unprecedented levels and reveal detail invisible to the naked eye from the photographer’s real-life position. Below are two pictures taken with the new device – for a glimpse of the technology on offer, zoom in on the rock climber’s foot in the photo on the right.

 

 

Nokia Unveil the 808: 41MP camera with PureView and Xenon

When Nokia unveiled the N8 in 2010, it set the bar for imaging in mobile devices. Not for the first time, either – Nokia phones have allowed users to take high quality pictures for a long time, and when the N8 was released Apple still hadn’t worked out how to include a flash for low-light situations, while HTC cameras were little more than a joke to both the industry and the public. While the competition has improved remarkably in the past 12 months – with iPhones and HTCs now taking remarkably good pictures – the N8 has remained the undisputed image king, thanks in large part to the sensor it packed, which let in more light than that of any other phone on the market.

Today, however, sees the N8 dethroned. It wasn’t the competition that knocked it off its top spot, but Nokia itself, with the Nokia 808 – the much-rumoured and long-awaited successor to the N8. A year ago Nokia CEO Stephen Elop announced that the image quality and capabilities of the N8 were just the tip of the iceberg of what Nokia was capable of and testing in its labs, and that piqued interest. According to Nokia:

“PureView imaging technology is the result of many years of research and development and the tangible fruits of this work are amazing image quality, lossless zoom, and superior low light performance…

…One of the reasons the Nokia 808 PureView has taken so long to develop is down to processing power. We simply couldn’t get hold of enough. Even the most powerful mobile chipsets have an upper limit of around 20Mpix image processing capability. The Nokia 808 PureView eats up more than double that. For video, the amount of pixels handled through the processing chain is staggering — over 1 billion pixels per second, and 16x oversampling. That’s a throughput of pixels 16 times greater than many other smartphones.”

 

The Nokia N8 had a 1/1.83” sensor, the largest ever to appear in a mobile phone and bigger even than those in many point-and-shoot cameras. To drop jaws around the world, the 808 has a sensor 2.5 times larger than that of the N8, and the device packs a Xenon flash for snapping photos and an LED for recording videos, offering users the best of both worlds.

For full details of why this is important and why it has leapfrogged the competition with little hope of being caught for years, Nokia’s own .pdf can be read here, and excerpts can be read below. Before that, though, let’s take a look at some sample photos taken from Nokia’s Flickr account.

 

 

PureView Pro imaging technology doesn’t represent a step change for camera smartphone performance so much as a quantum leap forward. The first device to feature Nokia PureView Pro camera technology is the Nokia 808 PureView, which gives people the means to take better images and video footage than ever before. Nokia PureView Pro turns conventional thinking on its head. It dispenses with the usual scaling/interpolation model of digital zoom used in virtually all smartphones, as well as the optical zoom used in most digital cameras, where a series of lens elements moves back and forth to vary the magnification and field of view. Instead, we’ve taken a completely new road.

The result?


Unprecedented camera control and versatility, combined with truly spectacular-quality images and video. Nokia 808 PureView sets new industry standards — it will give you around 3x lossless zoom for stills, and 4x zoom in full HD 1080p. For 720p HD video, you’re looking at 6x lossless zoom. And for nHD (640×360) video, an amazing 12x zoom!

 

Always true to the image

With the Nokia N8, we limited the digital zoom to just 2x to avoid too much compromise to image quality. But at the end of the day, this was still a conventional digital zoom. With the Nokia 808 PureView, zoom is handled completely differently — like nothing that has gone before. We’ve taken the radical decision not to use any upscaling whatsoever. There isn’t even a setting for it.

When you zoom with the Nokia 808 PureView, in effect you are just selecting the relevant area of the sensor. So with no zoom, the full area of the sensor corresponding to the aspect ratio is used. The limit of the zoom (regardless of the resolution setting for stills or video) is reached when the selected output resolution becomes the same as the input resolution.

For example, with the default setting of 5Mpix (3072 x 1728), once the area of the sensor reaches 3072 x 1728, you’ve hit the zoom limit. This means the zoom is always true to the image you want.
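
To make the crop-to-zoom arithmetic concrete, here is a minimal Python sketch (an illustration, not Nokia’s implementation) of how the lossless zoom limit falls out of that rule. The 7728-pixel sensor width is an assumed 16:9 active width, chosen only because it reproduces the 4x, 6x and 12x video figures quoted above; the function name and numbers are illustrative.

# Hedged sketch: maximum lossless zoom is simply how many output-widths fit
# across the active sensor area. The 7728-pixel width is an assumption for
# illustration, not an official figure from this post.
ASSUMED_SENSOR_WIDTH_16_9 = 7728  # assumed active sensor width in 16:9 mode, in pixels

def max_lossless_zoom(output_width: int, sensor_width: int = ASSUMED_SENSOR_WIDTH_16_9) -> float:
    """Zoom stops being lossless once the cropped sensor region shrinks to the output resolution."""
    return sensor_width / output_width

if __name__ == "__main__":
    for label, width in [("1080p video (1920 px wide)", 1920),
                         ("720p video (1280 px wide)", 1280),
                         ("nHD video (640 px wide)", 640)]:
        print(f"{label}: ~{max_lossless_zoom(width):.1f}x lossless zoom")
    # Stills use a different active area (aspect-ratio dependent), so the ~3x stills
    # figure quoted earlier is not expected to fall out of this exact assumed width.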

New depth, new detail

The way Nokia PureView Pro zoom works gives you many benefits. But the main one is undoubtedly ‘pixel oversampling’.

Pixel oversampling combines many pixels to create a single (super) pixel. When this happens, you keep virtually all the detail, but filter away visual noise from the image. The speckled, grainy look you tend to get in low-lighting conditions is greatly reduced. And in good light, visual noise is virtually non-existent. Which means the images you can take are more natural and beautiful than ever. They are purer, perhaps a more accurate representation of the original subject than has ever been achieved before.
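
As a rough illustration of that idea (a sketch, not Nokia’s actual pipeline), oversampling can be modelled as averaging each block of noisy sensor pixels into one output “super pixel”: the scene survives at the lower output resolution while random noise is averaged down, roughly in proportion to the square root of the number of pixels combined.

# Rough illustration of pixel oversampling (a sketch, not Nokia's actual processing):
# average each n x n block of noisy sensor pixels into one "super pixel" and compare noise.
import numpy as np

rng = np.random.default_rng(0)

def oversample(image: np.ndarray, n: int) -> np.ndarray:
    """Average non-overlapping n x n blocks into single output pixels."""
    h, w = image.shape
    h, w = h - h % n, w - w % n                      # trim so the image divides evenly
    blocks = image[:h, :w].reshape(h // n, n, w // n, n)
    return blocks.mean(axis=(1, 3))

# A flat grey scene plus random sensor noise.
scene = np.full((1024, 1024), 0.5)
noisy = scene + rng.normal(scale=0.05, size=scene.shape)

print("noise before oversampling:", round(float(noisy.std()), 4))
print("noise after 4x4 oversampling:", round(float(oversample(noisy, 4).std()), 4))  # roughly a quarter of the noise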

 

Less is more.

The simple structure of Nokia PureView Pro beats more complicated designs hands down. Image definition is pin sharp, way superior to conventional zoom designs. Conventional designs need many more lens elements to provide the zoom capability and correct aberrations, but these interfere with definition and/or light transmission. Our simple structure has enabled a significant improvement in manufacturing precision, and our lenses are produced with 10x greater precision than SLR lenses. This was essential to allow the PureView Pro sensor and optics to work in complete synergy.

Neat and compact.

The size of the Nokia 808 PureView camera (including sensor and optics) is at least 50%-70% smaller than a conventional optical zoom design.

Effective zoom settings.

You can get right up close with any zoom setting. Typically, optical zoom gets closest with wide (rather than tele) lens settings. Which means you have to stand physically closer to whatever you’re shooting, obscuring the light and possibly casting unwanted shadows. With the Nokia 808 PureView, you can use the full zoom capability at a shooting distance of 15cm, providing greater-than-ever magnification of small objects.

On a more technical note…

Oversampling eliminates Bayer pattern problems. For example, conventional 8Mpix sensors include only 4Mpix green, 2Mpix red and 2Mpix blue pixels, which are interpolated into an 8Mpix R, G, B image. With pixel oversampling, all pixels become true R, G and B pixels. What’s more, based on the Nyquist theorem, you actually need oversampling for good performance: audio, for example, needs to be sampled at 44 kHz to get good 22 kHz quality.
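
To illustrate the Bayer point (again a sketch under assumptions, not Nokia’s algorithm), one simple way to see how oversampling yields true R, G and B per pixel is to collapse each 2x2 RGGB cell of a raw Bayer mosaic into a single full-colour output pixel, so no colour value has to be interpolated from neighbouring sites. The function name and the tiny example values are hypothetical.

# Sketch: collapse each 2x2 RGGB Bayer cell into one true-RGB output pixel (2x2 binning).
# This illustrates the idea described above, not Nokia's actual demosaicing pipeline.
import numpy as np

def bayer_rggb_to_rgb(mosaic: np.ndarray) -> np.ndarray:
    """mosaic: (H, W) raw Bayer data laid out as RGGB; returns an (H/2, W/2, 3) RGB image."""
    r  = mosaic[0::2, 0::2]                      # red sites
    g1 = mosaic[0::2, 1::2]                      # first green site in each cell
    g2 = mosaic[1::2, 0::2]                      # second green site in each cell
    b  = mosaic[1::2, 1::2]                      # blue sites
    return np.dstack([r, (g1 + g2) / 2.0, b])    # every output pixel uses measured R, G and B

# Tiny 4x4 example mosaic (values are made up for illustration).
raw = np.arange(16, dtype=float).reshape(4, 4)
print(bayer_rggb_to_rgb(raw).shape)  # (2, 2, 3): half the resolution, but true RGB per pixel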

 

Quality, not quantity

People will inevitably home in on the number of pixels the Nokia 808 PureView packs, but they’re missing the point. The ‘big deal’ is how they’re used. At Nokia, our focus has always been capability and performance.

The main way to build smaller cameras over the years has been to reduce the pixel size. Pixels have shrunk over just the past 6 years from 2.2 microns, to 1.75 microns, to 1.4 microns (which is where most compact digital cameras and smartphones are today). Some new products are on the way with 1.1 micron pixels. But here’s the problem. The smaller the pixel, the fewer photons each pixel is able to collect. Fewer photons, less image quality. There’s also more visual noise in images/videos, and various other knock-on effects. In our experience, when new, smaller pixel size sensors are first released, they tend to be worse than the previous generation. While others jump in, banking on pixel numbers instead of performance, we prefer to skip early iterations.

Lessons learned

With the 12Mpix Nokia N8, for example, we were more concerned with capturing photons of light than with ramping up the number of megapixels. We bucked the trend and went with a large sensor and 1.75 micron pixels — but the result was a new benchmark in image and video quality. This set the Nokia N8 apart at the time, and competitors are still trying to match it two years later. The Nokia PureView Pro comes equipped with an even larger sensor, at 1/1.2” approximately 2.5 times larger than the sensor used in the Nokia N8. The result is an even larger area to collect photons of light. With PureView we’re continuing to make choices focused on performance rather than pixels for pixels’ sake. Fewer but better pixels can provide not just better image and video quality, but a better overall user experience and system capability.

In fact, 5Mpix-6Mpix is more than enough for viewing images on a PC, TV, online or on smartphones. After all, how often do we print images bigger than even A4?

Fire Safe Cigarettes Hitting Europe?

Although there has been talk of fire safe cigarettes reaching the UK for the last seven years, the measure has never come to pass. Now, though, the London Fire Brigade is reporting on its website that the European Commission has agreed on a safety standard for cigarettes, and reduced ignition propensity (RIP) cigarettes (otherwise known as fire safe cigarettes, or FSCs) are expected to go on sale across the EU from November 2011. The measure is described as “voluntary”, but if manufacturers do not comply their products can be removed from the market. (As of today, the measure has not yet come into force, but is still on the cards.)

Traditional cigarettes, currently still on sale within the EU, stay lit until reaching the butt 99% of the time, while fire safe cigarettes, currently on sale in the USA, Canada and Australia, retain ignition only 10% of the time unless the smoker puffs to keep the cigarette alight. These cigarettes work by adding two or three bands of paper along the length of the cigarette, which reduces the oxygen flow and thus causes the cigarette to go out if not puffed at that moment. There are, however, questions over the safety of these new cigarettes.

The state of New York mandated fire safe cigarettes in June 2004, and in January 2005 the Harvard School of Public Health published its study on them. The authors tested nineteen compounds in the cigarettes, and all nineteen were present at higher levels than in their non-FSC counterparts, with carbon monoxide higher by 11.4% and naphthalene by 13.9%. Naphthalene can cause myriad side effects if one is exposed to enough of it, such as anaemia, convulsions, vomiting and even comas. In addition to the increased levels of toxic compounds, the bands of paper are glued on with an ethylene-vinyl acetate copolymer emulsion-based adhesive – the same kind of material used as carpet glue or in glue-gun sticks. There is evidence, therefore, that fire safe cigarettes contain higher levels of toxic compounds than ‘normal’ cigarettes, and at a time when much focus is given to the ingredients in cigarettes, should lawmakers not be aiming to reduce these levels rather than increase them?

Away from laboratory testing and into real-world cases, the Internet is awash with reports from smokers who, since smoking fire safe cigarettes, have suffered various health problems that promptly ceased when they switched back to non-fire safe cigarettes or rolled their own. Such is the extent of the problem, in fact, that a petition to remove these cigarettes from the market now has over 27,000 signatures, many of them citing health complaints attributed to the cigarettes.

It has long been known that traditional cigarettes contain an accelerant to keep the cigarette burning. From a business perspective it makes perfect sense: if the cigarette burns faster, the smoker is likely to consume more and thus purchase more. Rolling papers do not contain this accelerant and are well known for going out regularly, requiring the smoker to relight. One of the most frequently mentioned facts about cigarettes is that they contain over 4,000 chemicals, so the question is: why not simply remove the accelerant, achieving the same effect as RIP cigarettes, rather than add more chemicals and increase the toxicity of those already present? In The Medical Journal of Australia in 2004, Simon Chapman wrote that “The elimination of citrate and other burning agents in cigarette paper thus appears to be a simple and effective means of dramatically reducing the ignition propensity of cigarettes.”

A final point to consider is the overall effectiveness of fire safe cigarettes. New York’s statistics on smoking-related fires show that there were 5.36 fires per 10,000 smokers in the four years preceding 2004 and 5.69 in the four years afterwards. In 2008 alone there were 6.37 fires per 10,000 smokers, meaning that the rate of smoking-related fires has actually increased since the introduction of RIP cigarettes – which is even more troubling when one considers that the smoking rate dropped from 21.6% in 2003 to 16.8% in 2008.

Unlike the heated discussions surrounding other smoking-related legislation such as plain packaging, display bans and graphic warnings, which centre on the likelihood of success and on government interference in personal choice, the debate on fire safe cigarettes hinges on the safety of the product itself – according to the reports of those suffering from them, the risks of smoking have shifted from an increased chance of disease later in life to an immediate impact on health. While no one would argue against cigarettes that genuinely lower the chances of fires (or are safer in any other way), the issue in question here is whether fire safe cigarettes are really the answer they are presented as.

 

*This entry first appeared at www.smokescreens.org, October 2011, and has been slightly modified here*

Film Review: “The Woman In Black”

The Woman In Black has received a lot of attention since its announcement, partly because it is a well-known story from the book and stage production, and partly because all eyes are on Daniel Radcliffe.

The film is a big departure for Radcliffe, who will forever be known as Harry Potter, no matter how illustrious his career goes on to be. Focusing on the actor for a moment, there can be little criticism of his performance: from his appearance to his character portrayal, he performed wonderfully. The drawback to casting Radcliffe, however, lies not in his performance but in the fact that the public has known him for so long as a young person, thanks to Harry Potter, that it is hard to shake that from your mind when watching The Woman In Black; it is difficult to disengage the actor from his previous roles and view him objectively in this film. Put another way, we feel like we know Radcliffe, and that offers a sense of comfort when watching him perform in a horror film – it removes a large element of the suspense, because we are simply too comfortable with the lead actor to feel scared by him. And because the film is almost voyeuristic, in that you feel as though you are accompanying Radcliffe on his journey, there is a peculiar sense of protection throughout.

All of that could be overlooked, and indeed overcome, if the writing were stellar, but sadly it wasn’t. With a 12A certificate, The Woman In Black was never going to be a terrifying, white-knuckle ride that kept viewers on the edge of their seats. It gets off to a slow start, in which we learn of Radcliffe’s sad personal problems before he embarks on a journey from London to England’s North East as part of his job as a lawyer; his task is to get the paperwork of a deceased widow in order so her house can be sold. Once we see the house, the film starts to rely too heavily on tired horror clichés to shock an audience that is desensitised to such attempts.

Scares and moments of suspense were thin on the ground, and they were almost entirely undone by obvious camera movements (who doesn’t expect something to happen when the actor is to one side of the screen while what’s behind him is in full view, or when there’s a close-up and the camera then pulls back?) or by creepy music acting as a big neon sign warning of an impending moment for which to prepare. For a widely publicised film with a huge star in the lead role, not to mention the story’s history as a play and a book, there could and should have been more substance. With such a vast back-catalogue of haunted house films and stories from which to draw, The Woman In Black could have been one of the scariest films to be released, but instead it found itself with an identity crisis, stuck in no-man’s land somewhere between horror and drama, not quite knowing where to lay its loyalties and eventually deciding on neither. You know you should be scared, but are left wondering why you weren’t.

And for a 90-minute film, you’re also left wondering why it took so long for the story to get going. A slow start would be fine had it been a longer film, or had it picked up into a flurry of activity that left viewers hiding behind their hands from a ceaseless onslaught of scares and suspense, but instead, when the lights come on, there is nothing to think but how anti-climactic it all turned out to be. Throughout the film, the most common reaction from the audience was not gasps and screams but laughter – not the reaction horror filmmakers tend to go for, and a rather clear indication of how scary this film turned out to be.

Perhaps it was the inclusion of Radcliffe that encouraged the writers to keep the film tame – with his fan base predominantly comprising the younger members of society, and his reputation mostly that of a child star, it may have been considered too big a leap for him to enter a bona fide adult horror film, and instead a deliberate attempt was made to achieve a 12A rating. That is pure speculation, but if it is true, it was a mistake. A film like The Woman In Black deserves a genuinely terrifying script and a more experienced director behind the camera, to ensure that big scares occur and that, when they do, they aren’t pre-empted by camera placement or music. The success of the Paranormal Activity trilogy should be a lesson to all horror filmmakers: less is more. An audience is more scared when something happens out of the blue, with nothing to prepare them for the scare – and the final thirty minutes should be a hive of scares. The Woman In Black deserved that treatment, but it didn’t get it.

Rating: 2/5