Workplace Discrimination Laws and Cases: What You Can Do

Workplace Discrimination Cases: How You Can Know If It’s Happening To You and What You Can Do About It

Discrimination and harassment take many forms, and many federal laws forbid discriminating against or harassing people based on color, race, national origin, religion, sex, disability, age, or pregnancy.

Local and state laws contain similar protections and can extend them to other circumstances as well. Many comprehensive laws address and forbid workplace discrimination and harassment. If you’re an employee and feel you are being discriminated against or harassed by your employer or your co-workers, what options are available to you?

1 – Talk To Your Employer About Your Feelings

A good way to start dealing with discrimination or harassment is to talk with your employer. Many of these acts go unpunished because the victim never makes it clearly known that the behavior is unwelcome. It’s very rare for employers to openly admit to discrimination or harassment and assist you in bringing legal action against them. Your employer must comply with the law, but you must ensure that your rights are protected.

2 – Inform Them About The Issues

It’s important that your employer knows you’re serious about the matter. Make sure a written report is made each time you report an incident. Ask for an investigation into the matter and for corrective action to be taken against the offender(s). Employers are required to look promptly into all reports of workplace discrimination and harassment.

How Can You Know Which Actions Violate Discrimination Law?

The law doesn’t prohibit all prejudiced actions. It only forbids discrimination based on a status that is protected under federal law, such as:

- Age

- Color

- Disability

- National origin

- Race

- Religion

- Sex

- Union activity

That means that if an employer bases a decision on race, the employer can be legally liable for discrimination. If a minority employee is paid less than his or her counterparts because of race, the employer violates Title VII. It’s not illegal for an employer to pay one employee less than others if that employee is performing different tasks. The question is whether the difference in treatment is based on the person’s protected status. When it is, this is known as intentional discrimination, or disparate treatment.

Title VII also forbids practices that have the effect of discriminating against people of a protected class, even if the difference in treatment is not based on a protected class on its face. For instance, suppose an employer decides to hire only applicants who don’t have custody of pre-school age children. On its face, that hiring criterion is not based on a protected class.

Looked at more closely, however, the policy unduly screens out female applicants relative to male applicants, since women tend to be custodial parents. A policy of this kind has an inequitable effect, known as disparate impact. Title VII forbids disparate impact discrimination unless the employer can show that the policy is job-related and consistent with business necessity.

The ADA defines discrimination not just in terms of disparate treatment and disparate impact, but also in terms of the refusal to provide reasonable accommodation to an otherwise qualified individual with a disability.


Portfolio Management is Risky Business

A friend of mine (we’ll call him Al) was out looking at daycare centers with his wife. Their two-year-old daughter was ready to expand her horizons and learn the intricacies of social behavior, along with all the risks inherent in her new world. To Al’s dismay, no daycare center met the standards of control he expected. This new world was fraught with risk. Doors weren’t locked, and children could escape. There were no gates on the stairwells, and children could fall and injure themselves. Peanut butter was in the fridge, and children could get at it. Al wasn’t willing to run the risk of introducing his daughter to this environment. Oddly enough, Al didn’t have similar controls in his own house: no childproof door locks, no stair gates, and peanut butter in his fridge – sometimes on the counter!

It was clear to me that a person will hold an unfamiliar environment to a higher level of scrutiny than one they know well. It also became clear that a person’s experience determines the amount of risk they are willing to tolerate. For example, if I put three people in Al’s deficient daycare with a jar of peanut butter on the counter, the first person, with no children, may shrug their shoulders. The second, with a child, may say, “Maybe we should remove the jar of peanut butter.” The third, whose child has a peanut allergy, may say, “I need a peanut-free environment for my child. This is unacceptable.” This dependency on individual experience and individual risk tolerance becomes a greater issue for organizations. When trying to ascertain the level of risk inherent in a project portfolio at an enterprise level, it is difficult to compare like with like without a risk management process and model that represents the enterprise’s willingness to accept risk.

The Problem

Risks that are not identified cannot be assessed. While an organization depends on a project manager to identify the risks associated with a point-in-time project, there is no clear way to determine the inherent risks to the organization. Organizations that have made the move to portfolio management have been successful at time management, resource management, and time and budget status reporting at the portfolio level. While each of these advancements is a major achievement on its own, an organization that makes decisions on this data does so without a sense of the risk associated with the performance of the portfolio. Decisions get made, risks are reacted to, and many issues arise from unforeseen risks.

So what is wrong with this picture? After all, risk is an accepted part of business and life for pretty much everyone.

Risk is inherently a function of value: the more value at stake, the more risk one is exposed to. The notion that risk is a negative to be entirely avoided is therefore flawed, as avoidance can only be guaranteed when an organization invests in cash-cow initiatives where high value can be attained with no risk. We all know that cash-cow initiatives are not sustainable and are the exception, not the rule.

The ultimate argument is found in the financial markets, where stocks and bonds are valued by level of risk. Bonds are considered safer bets and therefore yield lower returns, while stocks are considered risky investments and are expected to yield higher returns.
Over the past 100 years the financial markets have designed numerous mechanisms to manage the dynamics of risk and reward, with continued lessons learned along the way. Independent of industry, size, and source of funding (capital markets, private equity, tax dollars), organizations must be well versed in balancing risk and reward if they are to survive and succeed in the competitive and volatile economy of the 21st century.

With Risk Comes Opportunity

The old saying that “the apple does not fall far from the tree” rings true when one reflects on why risk management practices are at such an elementary level. The answer lies in what organizations have come to believe to be good project management.

So what happens to managing risk? Risks become issues, issues become actions, and actions get managed using the same project management processes designed to manage the value line. The problem is that project management practices designed to deliver value are built on nomenclature such as deliverables, milestones, performance indicators, quality, timeline, budget, approval, and benefit realization. These notions work perfectly for the value line, where the lingo describes value-based characteristics.

To manage risks, organizations need to invest in elevating their risk management practices to the project portfolio level, to attain the same level of maturity as their project management practices. Otherwise, risk management will remain at the mercy of an individual project manager’s experience, managed well by a few and missed by most. This key concept drives the requirement for organizations to baseline their risk tolerance and provide their project management team with a consistent set of risk management standards and practices. In the absence of such standards, risk tolerance and management will be inconsistent, since each project manager’s personal tolerance for risk will drive their approach to managing project risk. The danger is that some project managers will have a high tolerance for project risk while others will have a lower one, which may or may not match the priorities of the organization.

We have all come to appreciate the necessity of standardized project management tools and methodology, and very few organizations allow a project manager to use his or her own favorite tool and methodology. Risk management is no different, and organizations need to invest the same level of diligence in their risk management practices as they do in their project management practices.

The Framework

The identification of potential risks within a project portfolio is of major importance to a proactive risk assessment process. It provides the opportunities, indicators, and information that allow risks, major and minor, to be identified before they adversely impact an organization. An aggregate view of project risks within a portfolio gives an organization a holistic assessment of all risks, provided that the risk identification framework at the project level is comprehensive.

The first step in risk assessment is to clearly and concisely express the risk in the form of a risk statement.
A risk statement can be defined in the following terms:

- The risk statement outlines a state of affairs or attributes, known as conditions, that the project members feel may adversely impact the project.

- The risk statement also articulates the possibility of negative consequences resulting from the undesirable attribute or state of affairs.

- This two-part formulation has the advantage of coupling the idea of risk consequences with observable (and potentially controllable) risk conditions.

When formulating a risk statement, it is helpful to place it within the categories that best reflect the priorities of the organization. The project portfolio Risk Registry (Table 1) outlines the risk statements associated with the “strategy” risk category. The Risk Registry has the most value when customized to reflect the organization’s own risk categories and corresponding risk statements.

Once the project portfolio Risk Registry is vetted to reflect business priorities and challenges, the risk statements need to be evaluated against the probability and impact of actualization. The variables chosen to measure probability and impact should reflect the organization’s language, as it is critical that the baseline assessment is understood internally and represents organizational risk and exposure.

A quadrant analysis of risk actualization in terms of probability and impact provides the organization with transparent disclosure of risk at the project and portfolio level. This assessment enables an organization to attain a baseline understanding of project portfolio risk based on its own internal knowledge and experience.

The risk analysis model is designed to expand and normalize the project management judgment used in the risk assessment model, and to apply a consistent baseline for the probability and impact of all risk categories. It is composed of the following steps:

1. Industry sources are used to establish a complete repository of threats applicable to the organization.

2. Industry sources are used to determine the organization’s vulnerability to industry threats. The organization then uses internal knowledge to narrow the list of vulnerabilities to those most applicable to it.

3. To further validate the applicability and relevance of threats and vulnerabilities, a “so what” analysis is conducted in which the probability and impact of identified threats and vulnerabilities are further validated. The “so what” analysis uses metrics similar to the probability and impact metrics used in the risk assessment model.

4. COBIT control statements are used to determine the level of controls that an organization has in place, or could put in place, to effectively manage the risk associated with the outlined threats and vulnerabilities. Although COBIT controls are mostly designed for IT, in-depth testing has shown that they are applicable to both IT and non-IT threats and vulnerabilities.

The outcome of the analysis phase is a repository of threats, vulnerabilities, and controls assessed and validated through a series of workshops, where project and portfolio managers’ input is given the same weight as industry best practices. This ensures that the analysis results are applicable to the organization rather than to a hypothetical environment.
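To make the assessment concrete, here is a minimal Python sketch of a risk registry whose entries carry a two-part risk statement, are scored by probability and impact, and are classified into quadrants. The category names, the 1–5 scales, and the threshold of 3 are illustrative assumptions, not part of the framework itself; a real registry would use the organization’s own categories and scales.

```python
from dataclasses import dataclass

@dataclass
class RiskStatement:
    """Two-part risk statement: an observable condition and its possible consequence."""
    category: str      # e.g. "strategy" -- illustrative category name
    condition: str     # observable state of affairs
    consequence: str   # possible negative outcome
    probability: int   # 1 (rare) .. 5 (almost certain) -- assumed scale
    impact: int        # 1 (negligible) .. 5 (severe) -- assumed scale

def quadrant(risk: RiskStatement, threshold: int = 3) -> str:
    """Classify a risk into a probability/impact quadrant."""
    high_p = risk.probability >= threshold
    high_i = risk.impact >= threshold
    if high_p and high_i:
        return "high probability / high impact"   # act on these first
    if high_p:
        return "high probability / low impact"
    if high_i:
        return "low probability / high impact"
    return "low probability / low impact"

# A toy registry; real entries would come from the vetted Risk Registry.
registry = [
    RiskStatement("strategy", "sponsor turnover mid-project",
                  "loss of executive support", probability=4, impact=5),
    RiskStatement("strategy", "vendor roadmap unclear",
                  "rework of the integration layer", probability=2, impact=4),
]

# Portfolio-level view: the same classification applied across all projects.
for r in registry:
    print(f"[{r.category}] {r.condition} -> {r.consequence}: {quadrant(r)}")
```

Because every project scores its risks against the same scales, the quadrant counts can be aggregated upward, which is what makes the like-with-like portfolio comparison possible.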
An organization’s risk tolerance is directly influenced by its ability and desire to invest in controls designed to adjust that tolerance. The action model provides the framework to operationalize the risk assessment and risk analysis findings through the implementation of the controls that provide the best level of risk mitigation for the portfolio’s priorities.

The action model leverages the “so what” analysis to determine which controls provide the optimal mitigation for the threats and vulnerabilities with the highest probability of actualization and/or the greatest implications. Furthermore, the action model provides the ability to assess the utility of existing controls in order to identify portability and reusability opportunities.

The action model also enhances the reliability of the quadrant report produced in the risk assessment and risk analysis phases, and specifically identifies the value of investment in controls as a means to mitigate threat probability and vulnerability impact.

In conclusion, the action model enables organizations to improve the effectiveness of the processes used to deliver projects through investment in controls. It also defines the roles, responsibilities, and processes required to operationalize the risk assessment and risk analysis models in the form of specific actions. Roles such as Risk Manager and Risk Analyst are defined and incorporated into the business process. Each role in the risk management process has responsibility, accountability, and specific tasks within the risk assessment, risk analysis, and risk action models. Finally, the action model enables organizations to establish pragmatic risk management processes.

Summary

Organizations are expected to manage risks and deliver high-value capital projects; anything else is considered sub-optimal performance. Delivering high-value projects requires a project management workforce with significant talent for managing both the value line and the risk line.

Managing project risk is no different from managing investment risk. In both cases, the “customer” who provides the capital demands that the investment be managed by professionals who understand and leverage risk to maximize return on investment. Failing to do so ends with the “customer” finding other alternatives, as investment capital is a precious commodity.

Tools designed to automate risk management become extremely valuable once organizations have understood and implemented the appropriate management processes. Unfortunately, many organizations fall into the trap of buying pieces of technology without an in-depth understanding of the requirements and processes needed to use them.

Organizations have the technology and talent to deliver high-value projects through effective and transparent management of risk; what they need to establish are the supporting risk management processes. Start with a framework designed to build an enabling risk management process that manages project portfolio risk relative to organizational requirements. If we can all agree on the tenets of risk in our respective organizations, we won’t have to suffer through the miscalculation and mismanagement of risk.
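As a rough illustration of the action model’s control selection step, the sketch below scores candidate controls by how much probability and impact reduction they buy per unit of cost. The control names, mitigation numbers, and costs are invented for this example and are not drawn from COBIT or from any real registry; a real analysis would use the organization’s validated threat and control data.

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    cost: float            # implementation cost, arbitrary units (assumed)
    prob_reduction: int    # points of probability removed (assumed scale)
    impact_reduction: int  # points of impact removed (assumed scale)

def mitigation_per_cost(c: Control) -> float:
    """Crude utility score: total risk points removed per unit of cost."""
    return (c.prob_reduction + c.impact_reduction) / c.cost

# Hypothetical candidate controls for one threat/vulnerability pair.
candidates = [
    Control("steering-committee checkpoint", cost=2.0, prob_reduction=2, impact_reduction=0),
    Control("contractual exit clause",       cost=5.0, prob_reduction=0, impact_reduction=3),
    Control("dual-vendor strategy",          cost=8.0, prob_reduction=1, impact_reduction=3),
]

# Invest first in the control with the best mitigation-per-cost ratio.
best = max(candidates, key=mitigation_per_cost)
print(f"Invest first in: {best.name} (score {mitigation_per_cost(best):.2f})")
```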
After my friend Al communicated his concerns to his wife, the two of them created a framework for identifying acceptable risk in a daycare provider, and discussed why they hadn’t held their own home (the primary daycare) to the same standard. They determined how much they were willing to spend to mitigate certain risks, and how much residual risk they were willing to bear. In the end, Al and his wife were able to select the daycare provider with the most reasonably safe environment for their child. They were also able to develop a clear picture of some of the deficiencies in their own home and address them accordingly. The framework was critical in framing the conversation and giving them a basis for discussion that ultimately enabled them to make an important choice. If only all organizations were run that way.


How an Operating System’s File System Works

File systems are an integral part of any operating system with the capacity for long-term storage. There are two distinct parts to a file system: the mechanism for storing files and the directory structure into which they are organised. In modern operating systems, where it is possible for several users to access the same files simultaneously, it has also become necessary to implement features such as access control and different forms of file protection.

A file is a collection of binary data. A file could represent a program, a document, or in some cases part of the file system itself. In modern computing it is quite common for there to be several different storage devices attached to the same computer. A common data structure such as a file system allows the computer to access many different storage devices in the same way. For example, when you look at the contents of a hard drive or a CD, you view them through the same interface, even though they are completely different media with data mapped onto them in completely different ways. Files can have very different data structures within them, but they can all be accessed by the same methods built into the file system. The arrangement of data within a file is then decided by the program that creates it. The file system also stores a number of attributes for the files within it.

All files have a name by which they can be accessed by the user. In most modern file systems the name consists of three parts: a unique name, a period, and an extension. For example, the file ‘bob.jpg’ is uniquely identified by the first word, ‘bob’; the extension ‘jpg’ indicates that it is a JPEG image file. The file extension allows the operating system to decide what to do with the file if someone tries to open it. The operating system maintains a list of file extension associations; should a user try to access ‘bob.jpg’, it would most likely be opened in whatever the system’s default image viewer is.

The system also stores the location of a file. In some file systems, files can only be stored as one contiguous block. This simplifies storage and access, as the system then only needs to know where the file begins on the disk and how large it is. It does, however, lead to complications if the file is to be extended or removed, as there may not be enough space available to fit the larger version of the file. Most modern file systems overcome this problem by using linked file allocation, which allows the file to be stored in any number of segments. The file system then has to store where every block of the file is and how large each one is. This greatly simplifies file space allocation, but it is slower than contiguous allocation, as the file can end up spread out all over the disk. Modern operating systems overcome this flaw by providing a disk defragmenter, a utility that rearranges the files on the disk so that each is in a contiguous run of blocks.

Information about a file’s protection is also integrated into the file system. Protection can range from the simple systems implemented in the FAT system of early Windows, where files could be marked as read-only or hidden, to the more secure systems implemented in NTFS, where the file system administrator can set up separate read and write access rights for different users or user groups. Although file protection adds a great deal of complexity and potential difficulty, it is essential in an environment where many different computers or users can have access to the same drives via a network or a time-shared system such as raptor.
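As a rough illustration of linked allocation, here is a small Python sketch of a FAT-style allocation table, in which each block of a file records the index of the next block in the chain. The block count, table layout, and function names are simplifications invented for this example, not the actual FAT on-disk format.

```python
# FAT-style linked allocation, heavily simplified.
# `fat` maps a block index to the next block of the same file;
# END (-1) marks the last block of a file, FREE (None) marks a free block.
NUM_BLOCKS = 16
FREE, END = None, -1
fat = [FREE] * NUM_BLOCKS          # the file allocation table

def allocate(num_blocks: int) -> int:
    """Allocate a chain of blocks anywhere on the 'disk'; return the first block."""
    free = [i for i, nxt in enumerate(fat) if nxt is FREE]
    if len(free) < num_blocks:
        raise OSError("disk full")
    chain = free[:num_blocks]      # blocks need not be contiguous
    for here, nxt in zip(chain, chain[1:]):
        fat[here] = nxt            # link each block to the next
    fat[chain[-1]] = END
    return chain[0]

def blocks_of(first_block: int) -> list[int]:
    """Follow the chain from the directory entry's first block to the end."""
    blocks, here = [], first_block
    while here != END:
        blocks.append(here)
        here = fat[here]
    return blocks

start = allocate(3)                # a file occupying three (possibly scattered) blocks
print(blocks_of(start))            # the chain the file system follows to read it
```

A contiguous allocator, by contrast, would only need to record `start` and a length, which is why reading is faster but growing a file is harder.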
Some file systems also store data about which user created a file and at what time it was created. Although this is not essential to the running of the file system, it is useful to the system’s users.

In order to function properly, a file system needs a number of defined operations for creating, opening, and editing a file. Almost all file systems provide the same basic set of methods for manipulating files.

A file system must be able to create a file. To do this there must be enough space left on the drive to fit the file, and there must be no other file with the same name in the directory where it is to be placed. Once the file is created, the system makes a record of all the attributes noted above.

Once a file has been created, we may need to edit it. This may mean simply appending some data to the end of it, or removing or replacing data already stored within it. While doing this, the system keeps a write pointer marking where the next write operation to the file should take place.

In order for a file to be useful it must of course be readable. To read a file, all you need to know is its name and path; from this the file system can ascertain where on the drive the file is stored. While reading a file, the system keeps a read pointer that stores which part of the file is to be read next.

In some cases it is not possible to simply read all of a file into memory, so file systems also allow you to reposition the read pointer within a file. To perform this operation, the system needs to know how far into the file you want the read pointer to jump. An example of where this is useful is a database system: when a query is made, it is obviously inefficient to read the whole file up to the point where the required data is; instead, the application managing the database determines where in the file the required piece of data is and jumps to it. This operation is often known as a file seek.

File systems also allow you to delete files. To do this the system needs to know the name and path of the file. It then simply removes the file’s entry from the directory structure and adds all the space the file previously occupied to the free space list (or whatever other free space management system it uses).

These are the most basic operations required for a file system to function properly. They are present in all modern computer file systems, but the way they function may vary. For example, performing the delete operation in a modern file system like NTFS, which has file protection built in, is more complicated than the same operation in an older file system like FAT. Both systems would first check whether the file was in use before continuing; NTFS would then also have to check whether the user deleting the file has permission to do so. Some file systems also allow multiple people to open the same file simultaneously and have to decide whether users have permission to write a file back to the disk while other users have it open. If two users have read and write permission on a file, should one be allowed to overwrite it while the other still has it open? Or if one user has read-write permission and another only has read permission, should the user with write permission be allowed to overwrite the file if there’s no chance of the other user also trying to do so?
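The basic operations described above can be sketched with a tiny in-memory “file”, using explicit read and write pointers. Python’s own file objects expose the same ideas through read(), write(), and seek(); the OpenFile class below is a made-up illustration of the mechanism, not a real OS interface.

```python
class OpenFile:
    """Toy open-file handle with explicit read and write pointers."""
    def __init__(self):
        self.data = bytearray()   # the file's contents on 'disk'
        self.read_ptr = 0         # where the next read starts
        self.write_ptr = 0        # where the next write lands

    def write(self, chunk: bytes) -> None:
        end = self.write_ptr + len(chunk)
        self.data[self.write_ptr:end] = chunk   # overwrite or append
        self.write_ptr = end

    def read(self, size: int) -> bytes:
        chunk = bytes(self.data[self.read_ptr:self.read_ptr + size])
        self.read_ptr += len(chunk)             # advance past what was read
        return chunk

    def seek(self, offset: int) -> None:
        self.read_ptr = offset                  # the 'file seek' operation

f = OpenFile()
f.write(b"hello world")
print(f.read(5))    # b'hello'  -- sequential read from the start
f.seek(6)
print(f.read(5))    # b'world'  -- jump straight to the data we need
```

The database example in the text is exactly the seek() call here: rather than reading everything up to the record, the application jumps the read pointer directly to it.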
Different file systems also support different access methods. The simplest method of accessing information in a file is sequential access, where the information is read from the beginning, one record at a time. To change the position in the file, it can be rewound or forwarded a number of records, or reset to the beginning. This access method is based on file storage systems designed for tape drives, but it works as well on sequential access devices (like modern DAT tape drives) as it does on random-access ones (like hard drives). Although this method is very simple in its operation and ideally suited to certain tasks, such as playing media, it is very inefficient for more complex tasks such as database management.

A more modern approach that better facilitates reads that aren’t likely to be sequential is direct access. Direct access allows records to be read or written over in any order the application requires. This suits modern hard drives, which likewise allow any part of the drive to be read in any order with little reduction in transfer rate, and it suits most applications better than sequential access, since it is designed around the most common storage medium in use today rather than one now rarely used except for large offline backups. Given the way direct access works, it is also possible to build other access methods on top of it, such as sequential access, or an index of all the records in a file to speed up finding data.

On top of storing and managing files on a drive, the file system also maintains a system of directories in which the files are referenced. Modern hard drives store hundreds of gigabytes, and the file system helps organise this data by dividing it up into directories. A directory can contain files or further directories. As with files, there are several basic operations that a file system needs to be able to perform on its directory structure in order to function properly:

- Create a file. This is also covered by the overview of file operations above, but as well as creating the file, the system must add it to the directory structure.

- Delete a file. The space taken up by the file needs to be marked as free space, and the file itself needs to be removed from the directory structure.

- Rename a file. This requires an alteration to the directory structure, but the file itself remains unchanged.

- List a directory. In order to use the disk properly, the user needs to know what is in all the directories stored on it, and to be able to browse through them.

Since the first directory structures were designed, they have gone through several large evolutions. Before directory structures were applied to file systems, all files were stored on the same level: effectively a system with one directory in which all the files are kept. The next advancement, which could be considered the first true directory structure, is the two-level directory, in which there is a single list of directories, all on the same level, with the files stored inside them. This allows different users and applications to store their files separately. After this came the first directory structures as we know them today: directory trees. A tree-structured directory improves on the two-level directory by allowing directories as well as files to be stored in directories. All modern file systems use tree-structured directories, but many have additional features, such as security, built on top of them.
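A minimal sketch of a tree-structured directory is shown below: each directory maps names either to file entries or to further directories, which is what lets directories nest to any depth. The class and method names are invented for illustration and do not correspond to any particular file system’s API.

```python
class Directory:
    """A node in a tree-structured directory: names map to files or subdirectories."""
    def __init__(self):
        self.entries: dict[str, object] = {}   # name -> Directory or file metadata

    def create_file(self, name: str, size: int) -> None:
        if name in self.entries:
            raise FileExistsError(name)        # no duplicate names in one directory
        self.entries[name] = {"size": size}    # stand-in for the file's attributes

    def make_subdir(self, name: str) -> "Directory":
        sub = Directory()                      # directories can hold directories
        self.entries[name] = sub
        return sub

    def rename(self, old: str, new: str) -> None:
        # Only the directory entry changes; the file data itself is untouched.
        self.entries[new] = self.entries.pop(old)

    def listing(self) -> list[str]:
        return sorted(self.entries)

root = Directory()
home = root.make_subdir("home")
home.create_file("bob.jpg", size=2048)
home.rename("bob.jpg", "alice.jpg")
print(root.listing())   # ['home']
print(home.listing())   # ['alice.jpg']
```

Note how rename only touches the directory entry, matching the point above that renaming alters the directory structure while leaving the file unchanged.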
Protection can be implemented in many ways. Some file systems allow you to have password-protected directories: the file system won’t allow you to access a directory until it is given a username and password for it. Others extend this by giving different users or groups access permissions. The operating system requires the user to log in before using the computer and then restricts their access to areas they don’t have permission for. The system used by the computer science department for storage space and coursework submission on raptor is a good example of this. In a file system like NTFS, all types of storage space, network access, and the use of devices such as printers can be controlled in this way. Other types of access control can also be implemented outside the file system; for example, applications such as WinZip allow you to password-protect individual files.

There are many different file systems currently available on many different platforms, and depending on the type of application and the size of the drive, different situations suit different file systems. If you were to design a file system for a tape backup system, a sequential access method would be better suited than direct access, given the constraints of the hardware. Likewise, on a small hard drive in a home computer there would be no real advantage in using a more complex file system with features such as protection, as they aren’t likely to be needed.

If I were to design a file system for a 10-gigabyte drive, I would use linked allocation over contiguous allocation, to make the most efficient use of the drive space and limit the time needed to maintain the drive. I would also choose a direct access method over a sequential one, to make the most of the strengths of the hardware. The directory structure would be tree-based, to allow better organisation of information on the drive, and would allow for acyclic directories to make it easier for several users to work on the same project. It would also have a file protection system that allowed different access rights for different groups of users, plus password protection on directories and individual files.

Several file systems that already implement the features described above as ideal for a 10-gigabyte drive are currently available; these include NTFS, used by the Windows NT and XP operating systems, and ext2, used in Linux.

Best regards,
Sam Harnett MSc mBCS
Pixeko Studio – Web Developers in Kent
