One thing that tends not to happen much in science fiction (or in fantasy, where it's less overt but still an issue) is any kind of consideration of how technology shapes morality. I'm not talking absolutes here, but generalities - the kind of consensus morality that most people in any given society can agree on. If a society doesn't have that consensus, it's not going to last: without a consensus morality that covers a good chunk of what happens in public life (and usually private as well) people tend to revert to tribal mores, finding themselves groups that share their views and identifying with their chosen 'tribe' over everything else. I don't think I've seen the phenomenon described that way, but you can see it in descriptions of the dying Roman Empire, and in the long and painful decline of the Eastern Roman/Byzantine Empire. Once too many people (and I have no idea where the tipping point is) started thinking of themselves as something other than Roman - why they did this is a different beast altogether - the Empire was doomed.
That's actually not what I'm talking about here, but it is incidental to my point. One of the things that happens when an empire collapses is that technology levels in the general populace drop. In part this is because it takes a large, well-educated population to maintain and improve technology. Plus, technologists are specialists of a kind you tend to find only at high levels of technology: a society has to have a high food production level and a sophisticated trade and distribution network to maintain people whose principal function is basically making stuff that makes life easier. Once you knock a technological society back to pre-industrial or even to subsistence farming levels, that society loses the ability to maintain the people who build and run the tech.
The other big reason tech levels drop is the one I'm interested in. Morality. The consensus morality always lags behind technology by at least a generation. Sometimes it's a lot more - but it never freezes. If society is fragmenting, it's much more comfortable to drop back to the technology level that your morality can accept, at least at first. We tend not to see the massively interconnected frameworks that make up the current technological basis, and most of us don't have the luxury of choosing which technology we're going to accept. The Amish, who do, only have that luxury because they're protected by the much larger body of the society around them.
Now... Imagine a society where no-one carries any harmful recessive gene markers. Hemophilia and a host of other inheritable nasties no longer exist. Leaving aside the question of how those people got to that state (hint: probably not by gene surgery), would those people regard (consensual, between adults) incest as taboo?
I think after they'd been that way long enough, they wouldn't, but the shift would be a long and painful one, possibly involving ugly protests and persecution. It would make an interesting story - but it's not one anyone could write and have published today, because almost everyone today regards incest with abhorrence.
To add a new twist, what new taboos might have arisen? To put this in context, we have taboos and rituals today that were either impossible or unthinkable a few hundred years ago. Sometimes both. Take hand-washing. It's not possible to wash your hands regularly without at minimum a reliable supply of clean water. In Western societies today we take this for granted. Maybe in future societies they won't, because everyone will carry pathogen-eating nanobots. Or maybe they'll have some kind of bug-zapper that you walk through to instantly sterilize you and whatever you're carrying (this could cause problems in the future version of programming, since it will make sacrificing goats to one's computer rather more awkward. But anyway...). Possibly your future person wouldn't dream of going anywhere without having gone through the zapper first. Or would think that anyone who carries a child through pregnancy is insane when bio-wombs are so much more convenient and protect the growing embryo so much better than the old, messy, biological method.
Of course, there will be controversies. People may well be arguing that it's immoral to have a child whose genetic makeup is entirely derived from you (that is, two eggs or two sperm are merged to produce a unique individual who has only one biological parent). Maybe the big argument will be over whether or not technological artifacts can be considered people - and whether it's murder if someone destroys an AI. Perhaps the big moral issue of the day will be who actually 'owns' vat-grown meat and organs - is it the source of the DNA? Or the owner of the vat? What if there's a dispute over the ownership of the replacement parts that were grown for you after you wiped out a lung, your liver, and a hefty chunk of assorted other internal organs in an accident? Will you be the 'villain' for your foolishness in risking your life, or will the company that wants to repossess the organs be the 'villain'?
One of the aspects of Sarah's DarkShip Thieves that I really enjoyed was the moral conflicts she slipped in under the main storyline. Most of them aren't things that are much of an issue today - if they're even possible. Some of them were - to my delight - notions I'd never considered. There's a lot of that in Heinlein, too, and in Dave's writing. With Pratchett the moral questions tend to be masked somewhat with the fantasy setting, but not always (the question of dwarf gender comes to mind).
And now, so inhabitants of the future can read and laugh at how silly we were back in that primitive era of networking (or alternatively, so it can all vanish without trace when the last Google server rusts out in the new dark age), what do you think will be taboo in 50 years' time? 100 years? 500? How about way, way in the far future?