
Monday, January 19, 2009

Hawaii Episode of No Reservations

I was watching the Hawaii episode of Anthony Bourdain: No Reservations, and there it was: Ono Restaurant. I had actually eaten at a restaurant featured on the Travel Channel. I don't know if that is a good thing or a bad one. Nowadays, from what I'm reading in recent reviews, you can't just walk up and take a seat anymore unless you show up at odd hours.

When I visited Hawaii in 2001 (a little over a month after 9/11), it literally took me three days to find an authentic Hawaiian restaurant on Oahu. The one that I found was Ono. When I was there, I talked to the owner or manager (someone of authority) and told him that I was seeking out real Hawaiian food and had so far done so without luck until I heard about his place. He sat me down and paraded dish after dish in front of me. The experience was very similar to Bourdain's experience on his TV show. My favorite dish was laulau, of course. I think I gained 5 pounds that day, which I have yet to lose years later. OK, slight exaggeration, but I think I did overindulge.

I mentioned Ono in my Hawaiian vacation journal. I added that entry to my blog in 2005 here. I have a less than flattering photo of the front of the restaurant which I will keep private. From what I've been reading, it no longer matches what someone would find when they arrive there. Ono is still very small, but the term "hole-in-the-wall" doesn't seem to apply anymore. Much like Bourdain, I recommend making a stop at Ono Restaurant.

Tuesday, October 28, 2008

No on 8

I don't mean to turn my blog into a series of videos and images, but this issue is important and needs to be considered. Please join me (and Dianne Feinstein) in voting no on Prop 8!

Monday, April 28, 2008

GenPets, what are they?

Someone just sent me a link for this new product that is due on the shelves very soon! It's called Genpets.


Genpets ...Bioengineered buddies!


So, I wonder if they are edible? If so, you can donate them to a local food bank after they die and get a tax write-off for their amortized value.

Sunday, April 13, 2008

Identification of years

How many people actually know that there is no year zero on the common Western (Gregorian) calendar? This creates logical problems that are hard to deal with, because most people assume the calendar counts through zero the way ordinary numbers do. I would even catch myself thinking in terms of having a year zero had I not known a little more about our calendar than the average bear. Bottom line: on our calendar, the year 1 BC is followed by 1 AD.

Just as there is no year zero, there is no zeroth century. Our 21st Century is 2001 to 2100. The year 2100 is the last year of the 21st Century, not the first. That's fairly counter-intuitive. This does force me to think when I talk about periods in the 16th through 19th Centuries. It is very easy to assume that the 18th Century is the same as the 1800s, when it is actually the 1700s.
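Just to make the arithmetic concrete, here is a minimal sketch of the year-to-century mapping (the function name is mine, purely for illustration, and it only handles AD years):

```python
def century_of(year):
    """Century number for a positive (AD) Gregorian year.

    There is no year zero, so century N covers years 100*(N-1)+1
    through 100*N; the 21st Century runs from 2001 to 2100.
    """
    if year < 1:
        raise ValueError("expects an AD year, 1 or greater")
    return (year - 1) // 100 + 1

print(century_of(2001))  # 21 -- first year of the 21st Century
print(century_of(2100))  # 21 -- last year, not the first year, of the 21st Century
print(century_of(1850))  # 19 -- the 1800s are the 19th Century, not the 18th
```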

So, there is no year zero, no matter how confusing that ends up being. Until we choose another calendar system, this will be a contentious issue.

Another point to discuss is how to identify years. The most common abbreviations are B.C. for the years that count backward and A.D. for our current era (the years that count forward). These two abbreviations refer to a previously accepted date for the birth of Jesus Christ. It is now commonly agreed that if Jesus Christ did exist, his birth was more likely between 8 and 4 B.C. This means the start year of our calendar is pretty arbitrary, as it is not associated with any particular event. Yet, we still use terms that directly reference Jesus' birth. Alternative terms that have been proposed are BCE (Before the Common Era) and CE (Common Era). This turns the labels away from being Christian-centric, but in a way, it still attempts to enforce that old-world calendar on others. I see BCE/CE used more frequently these days, but I do not believe it will ever become the norm.

To accept the arbitrary nature of our calendar and to establish some Information Age standard, those Europeans have come up with a supposed standard, ISO 8601. This document is meant to be an international standard, but it isn't really in common use. The problem with ISO 8601 is that it renumbers the years that count backward: 1 BC becomes 0000 and 2 BC becomes -0001. Unless every history book ever written is updated to this new numbering, I doubt ISO 8601 will ever be in common use by anyone other than software programmers.
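To illustrate the renumbering, here is a minimal sketch of a BC/AD-to-astronomical-year conversion (the function name is my own invention for illustration, not anything defined by the standard):

```python
def to_astronomical(year, era):
    """Convert a BC/AD year to ISO 8601 astronomical year numbering.

    The backward-counting years are shifted by one: 1 BC becomes
    year 0 (written 0000), 2 BC becomes -1 (written -0001), and so on.
    AD years keep their usual numbers.
    """
    if year < 1:
        raise ValueError("BC/AD years start at 1; there is no year zero")
    era = era.upper()
    if era == "AD":
        return year
    if era == "BC":
        return 1 - year
    raise ValueError("era must be 'BC' or 'AD'")

print(to_astronomical(1, "BC"))     # 0  -> written 0000
print(to_astronomical(2, "BC"))     # -1 -> written -0001
print(to_astronomical(2008, "AD"))  # 2008, unchanged
```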

Monday, March 31, 2008

Drug Pushers

I used to be in favor of allowing drug companies the right to advertise their products directly to consumers. However, the more time that goes by, the more I realize how misguided that line of thinking is. History now bears witness to several truths about this matter.

Pharmaceutical companies do not do nearly the research they need to do to determine the effectiveness of their drugs before they start selling them to customers. Also, when such research is not favorable, they delay the release of the information to the public in order to drive more sales. The most recent example of this is Vytorin (and its component Zetia). These drugs were proven to reduce bad cholesterol. However, a dangerous assumption was made that this inherently also reduced the risk of heart attacks. The fact is that the drug does not reduce the risk of heart attacks. The drug companies Schering-Plough Corp. and Merck & Co. marketed this nearly useless drug for two years after they knew it did not work for its intended purpose, according to an AP article.

Pharmaceutical companies spend hundreds of millions of dollars on marketing campaigns. This takes money away from research and development. In my opinion, it is also likely the major reason that drug costs have risen drastically since the ban on drug advertisements was lifted.

Advertising drugs directly to the public encourages self-diagnosis. People are trying to be their own doctors. Advertising, along with the rise of the Internet, has given hypochondria new life and even legitimacy. Self-diagnosis is very dangerous.

Given these reasons, I am now in favor of re-establishing the restrictions on advertising for prescription drugs. This will help reduce the chances that corporate greed will take advantage of Americans. It will help reduce the cost of drugs. It will help provide more R&D funding for new treatments. And it will help reduce dangerous hypochondria and self-diagnosis.

Friday, February 15, 2008

Pop goes Mensa

Every once in a while, even the elite among us must venture into the realm of pop culture. MENSA, it turns out, is no exception, apparently. Their chairhuman just came up with his "Top ten smartest shows of all time (in no particular order)". Leave it to a MENSA member to list a top ten of anything in no particular order. Now, it must be made clear that MENSA in no way takes itself too seriously beyond the actual endeavor to find smart people. This is a group of people who freely laugh at themselves. So, in an effort to give closure to any readers of my article here, I ask forgiveness from both MENSA chairhuman Jim Werdell and Fancast for re-publishing their list, as follows (my commentary follows each entry):

1. M*A*S*H – It had smart repartee and was so much more than a comedy. Yeah, I'll watch its reruns when I'm bored and nothing else is on.

2. Cosmos (with Carl Sagan) – Sagan was able to communicate something extremely complicated to the layman and do it well, and that's unusual for a scientist at his level. It should be noted that Carl Sagan became an outcast among his peers in the scientific community because of his attempts to make science accessible to everyone.

3. CSI – The way they use science to solve their problems is intriguing to viewers. If only all the world's problems could be solved with a bit of science within an hour.

4. House – Again, it's a high-level type of show; it's the personality that makes it a winner, plus it deals with science. I am enjoying this show, but find I can't watch its reruns.

5. West Wing – You had to pay attention to stay up with it. The repartee was fast and furious and you needed a fairly high level of intelligence to keep up with it. I did enjoy this show a lot. It would've been nice if we really had a President like that. It's impossible.

6. Boston Legal – It's primarily because of the characters. The story lines are okay, but the characters are incredible and the writers give them great dialogue. I can watch this sometimes.

7. All in the Family – The show dealt with social issues before its time and was on the forefront of trying to show people's feelings, beliefs and the complexities of personality, in both a serious and comedic way. This was an important show in its day. It's ironic that a show with its social content couldn't be aired today even though we all think things are better now. I think it's more that we are happy with how effectively we are hiding the underlying issues now.

8. Frasier – The repartee was sensational; the main characters were very good. Even though they portrayed people who were likely of high intelligence, they also showed their weaknesses. This is a great show that I can enjoy watching over and over.

9. Mad About You – It's a personal favorite, I loved the characters and the back and forth. It was very smart. This was a good show that went deep into human relationships. Sometimes a little too deep.

10. Jeopardy – It's about the only game show that really tries to test people's intelligence. There's very little luck involved, and there are few game shows like that. I don't watch it all that much honestly, but from what I've seen it tests more than knowledge, it tests intelligence too. It's fun at times, but it isn't really about smarts; it's more about who can memorize the most information.

Saturday, December 29, 2007

Medical Myths and the Myths about those myths - Part 6 (woman decides the sex of her baby)

An old myth is that the woman decides the sex of her baby when she gets pregnant. Even today, some cultures still have this belief. So, what's the myth about this myth? When I was a child, I learned that it is the father that determines the sex of the baby. Of course, this is as big of a myth as believing the mother has control over this event.

The fact of the matter is that barring actual (and expensive) medical intervention, and for all practical purposes, the selection of a baby's sex is completely random. Neither the mother nor the father can make the selection through conscious effort.[1] There are many different and often bizarre myths surrounding conception, as an article at babycenter.com covers.[2]

Many different actions can be tried to improve the chance of getting pregnant, but sex selection is still not directly in the hands of either parent. Granted, Y (male) sperm have been deemed weaker and can be more affected by a woman's pH balance than X (female) sperm, yet more boys are born than girls on a worldwide basis. Nature has already figured all this out for us, and if any decision making has happened, it is a result of our species' evolution, not preferences we conjure up in our own minds.

Thursday, December 27, 2007

Medical Myths and the Myths about those myths - Part 5 (reading in the dark is bad)

Another myth I have found to be false is the idea that reading in the dark, or otherwise straining your eyes, can adversely affect your vision. According to a recent report, researchers have found no evidence that reading in dim light causes permanent eye damage. It can cause eye strain and temporarily decrease vision, but that subsides after rest.[1] Personally, I've never worried too much about what reading in the dark can do to my vision because it never made sense to me why it would have a permanent effect.

In fact, my own experience suggests that it is important to exercise my eyes. Both my parents and my sister require glasses, having less than ideal vision. I've always tried to keep my eyes exercised. Until a few years ago, I never noticed any issues with my eyesight. Recently, I started noticing some imperfection in my sight, so I went in to the eye doctor. Come to find out, vision in one of my eyes is 20/15, and it's 20/20 in the other. Apparently, my vision used to be in the 20/12 to 20/16 range. I was getting fussy about having lowly 20/20 vision. At times in the past, I doubted what I was doing. Now though, I at least have some sense of control regarding the health of my eyes.

Wednesday, December 26, 2007

Medical Myths and the Myths about those myths - Part 4 (hair and nails grow after death)

A myth I didn't hear about until sometime in the past ten years is that fingernails and hair grow after death. However, I never believed this myth because I heard it in the context of history and how people believed in vampires in the past. They used to dig up the graves of recently dead people to look for signs of undeath. They apparently often found such signs in the form of what appeared to be hair and nail growth on a corpse. Please note that I personally am not 100% convinced this is the source of the myth, because it directly contradicts a similar vampire-evidence story (myth?) about people in the Dark Ages believing that the bloated bodies of the recently dead were a sign of a vampire. Not to mention the fact that one of the beliefs about vampires is that if fed with blood, they return to the physical form they had at the point they became a vampire (meaning their hair can't grow). [How's that for talking about myths about myths? :-)]

The fact is, as I've heard from many different sources by now, the body's skin dries out sometime after a person dies. As it does so, it shrinks. The nails and hair remain in place, so it can appear as though they grow because they protrude farther out from the skin. That said, I would still guess it is possible the nails and hair do grow a tiny bit right after death, though they certainly do not continue to grow once all body functions stop.

Tuesday, December 25, 2007

Medical Myths and the Myths about those myths - Part 3 (cutting hair makes it grow back thicker)

"Myth: Shaved hair grows back faster, coarser and darker."


My own experience on this is that it is partially true. It's not a complete myth. Studies have shown that shaving doesn't affect hair in that way, nor do I think the act of trimming hair causes it to grow back thicker and coarser. I do think that how the skin surrounding the hair is treated has a significant impact. I did my own quasi-scientific study on myself when I was an adolescent. In an attempt to grow more body hair (being an adolescent trying to speed up the maturity process), I rubbed the skin on my chest aggressively over a period of a few months or so. To see whether what I was doing had any effect, I rubbed one side of my chest more often than the other side. Having seen no immediate hair growth, I stopped this practice. It was about a year later that I noticed the results. As hair did start to grow in, it only grew in where I had rubbed my skin months before. Yes, the hair on my chest grew in with the pattern I used to rub my skin. Even today, the thickest portion of my chest hair still vaguely matches the initial pattern I established during my adolescence.

Another example of the effect of skin treatment is more recent. Although I'm not rapidly balding, I've always had a high and somewhat asymmetric hairline at the corners of my forehead. A few years ago, I learned that DHT levels in the skin and body can influence hair growth. It can cause hair to grow in some parts of the body, and it can also cause a receding hairline on a man's forehead. I started a passive search for products that could affect DHT levels in the skin. In that search, I discovered Nioxin. The brand makes a claim that its shampoo can wash away DHT from the scalp. I tried it. I didn't see any immediate results. Eventually, I decided just to use it up. After about six months or so, I kinda started noticing what appeared to be new tiny baby hairs at the edge of my hairline. I decided to try Nioxin out for another few months. I bought more and kept using it. Over the next year, I noticed the hair growing back even more. Now, we aren't talking about thick new grass on the field, but I did perceive the start of a recovery of what I had lost over the past decade.

After a while, I mentioned to Allie that I noticed a difference. She didn't believe me. Then one day, she was looking at me while we were snuggling and all of a sudden she was all like, "Is this all new?" with an amazed look on her face as she ran her fingers along my hairline. At first, I thought she was just trying to stroke my ego, so I challenged the comment. But I know her. She's not the girly type that does that sort of thing. She was being genuine. Since then, my hairline has evened out a bit, and it continues to recover at a slow pace.

So, from my own personal experience I can say cutting hair doesn't change it, but how you treat the skin surrounding the hair definitely does impact it, though it takes a long time.