Equality, Community and Continuity: Reviewing the UK Rules for Constituency Redistributions – Part 2

The review of Parliamentary constituencies that ended prematurely in 2013 would have resulted in most of the 600 seats contested at the 2015 general election being very different from the current 650. In this second blog based on their research Ron Johnston, David Rossiter and Charles Pattie outline why electoral quotas, rather than a reduction in the number of MPs, would be the primary cause of disruption in a boundary review.

The redistributions undertaken by the Boundary Commissions in 2011-2013 were aborted by Parliament for political reasons before their completion, so the 2015 general election will be fought in the current constituencies. But implementation of the 2011 Act was merely delayed until 2016, and a new set of reviews initiated then, if conducted under the same Rules for Redistributions, will be as disruptive to the current map of constituencies as those aborted in 2013. However, this will be primarily due to the imposition of electoral quotas rather than to any reduction in the number of MPs.

In seeking to reduce the number of MPs from 650 to 600, the coalition government was less concerned with the impact on boundaries than with reducing the cost of Parliament. But as MPs and others saw seats disappear from the map amid the general disruption, the two issues became somewhat conflated: surely this reduction had to be part of the cause? Our research suggests its impact was slight, however. A few more seats might have escaped change had the number of MPs not been altered, but the causes (and possible solutions) of the major disruption lay elsewhere.

The imposition of a single electoral quota plus the reduction in the number of MPs meant that the formerly over-represented parts of the United Kingdom would experience larger decreases in their Parliamentary delegations than others: the number of Welsh MPs would decline from 40 to 30 (a 25 per cent reduction) and Scottish MPs from 59 to 52 (a 12 per cent loss); Northern Ireland’s decline was from 18 to 16 (11 per cent) and England’s from 533 to 502 (6 per cent).


Equality, Community and Continuity: Reviewing the UK Rules for Constituency Redistributions – Part 1

The review of Parliamentary constituencies that ended prematurely in 2013 would have resulted in most of the 600 seats contested at the 2015 general election being very different from the current 650. The potential disruption alarmed many MPs and party organisations. In the first blog based on their recently published research, Ron Johnston, David Rossiter and Charles Pattie assess whether changing the rules for defining constituencies could reduce the disruption to the map of constituencies.

Concern regarding variations in constituency electorates, coupled with a drive to cut the cost of Parliament in the wake of the 2009 expenses scandal, stimulated Conservative Party commitments in its 2010 General Election manifesto to legislate to ‘ensure every vote will have equal value’ and reduce the size of the House of Commons.

Legislation passed in 2011 put that intention into practice and the Boundary Commissions commenced their task of producing a new set of 600 constituencies all, with the exception of four special cases, having electorates within +/-5% of the UK average. They consulted on their proposals and revised them accordingly, but their work was halted by Parliament before its completion because of disagreements within the coalition on the programme of constitutional change. By then, however, MPs and party organisations had become aware that the new system, with its emphasis on electoral equality, disrupted the existing map of constituencies very significantly. Fully 54% of the current seats would be subject to major change, compared to only 30% at the last review, and many more constituencies would cross local government boundaries than previously.

The Boundary Commissions are currently required to begin their task again in 2016, in order to produce a new set of constituencies for the 2020 general election. But the very disruptive consequences of the previous exercise generated questions regarding the nature of the new procedure. Would it be possible to reduce the disruption substantially, yet maintain the general principle of electoral equality, with a more relaxed tolerance around the average? And would there be less disruption if the number of MPs was retained at the current 650, rather than reducing it to 600?

Our research answering these two questions has recently been published, and can be downloaded from the McDougall Trust website. It found that a more relaxed tolerance would reduce the disruption somewhat, but that at least one-third of all constituencies would almost certainly have to experience major change – regardless of the size of the House of Commons.


What does the public really think about democracy in Britain?

New data from the European Social Survey shows that while the British public value democracy many feel the UK government is failing to live up to its democratic ideals. Sarah Butt explores the key findings.

In response to the recent alleged “Trojan Horse” plot to radicalise pupils in Birmingham schools, Education Secretary Michael Gove has called for British values including democracy and the rule of law to be placed at the heart of the National Curriculum.  But what does living in a liberal democracy actually involve? And how confident are we that democracy in Britain lives up to these ideals?  New findings from the European Social Survey (ESS) provide an in-depth look at how well the British public feel democracy in Britain delivers what they think matters most.

Perhaps unsurprisingly, the vast majority of people in Britain think that it is important to live in a country that is governed democratically (average importance rating of 8.4 out of 10). However, people are more ambivalent about whether Britain actually is democratic (average rating of 6.6 out of 10). A significant minority (26%) do not rate Britain above five out of 10 on the democracy scale. There is, therefore, evidence of a democratic deficit.

High Expectations

The ESS reveals that people have high expectations of democracy.  The survey asked respondents to rate how important – on a scale from 0 to 10 – they considered a number of different attributes to be for democracy.   Most attributes received an average rating of at least eight out of 10 with people believing that democracy, in addition to guaranteeing free and fair elections and protecting civil liberties, should also protect people against poverty and involve citizens in decision-making.


Reigns in Spain and the ‘A’ word (again) in the UK

Robert Morris explains why the abdication of the Spanish King is unlikely to lead to a similar move by Queen Elizabeth II.

The recent announcement of the abdication of King Juan Carlos of Spain in favour of his heir, Felipe, has renewed discussion about abdication in the UK. Indeed, the abdicating King – anxious no doubt to make the best of a not very happy job – is reported as saying: ‘I don’t want my son to grow old waiting like Prince Charles’. Despite substantial demonstrations in favour of a republic, the abdication seems to be proceeding.

Will it happen here? Will Elizabeth II make way for her heir, Prince Charles? The present consensus is that it will not. This is hardly news. But there are two new twists offered on the usual account that it will not happen because the Queen believes she has to serve for the whole of her life.

Religion makes abdication impossible

The first twist is the suggestion by the Daily Telegraph that abdication is actually impossible because, unlike the Spaniards, the Queen has been consecrated in the religious ceremony of the coronation and the British monarchy is therefore ‘a sacerdotal system’. This was not the case with her uncle, Edward VIII, because he left the throne before becoming an anointed ruler as the result of a coronation ceremony. He was, however, undoubtedly King – a fact of law in no way dependent on coronation. This fact may be taken to emphasise that in UK law the sovereign occupies first and foremost a secular public office.


The grass roots are withering and the money is drying up – so what is the future for local parties in general election campaigns?

Ron Johnston and Charles Pattie discuss the evolution of constituency campaigning in the UK. Their book Money and electoral politics: Local Parties and Funding in General Elections was published earlier this month.

With the 2015 general election now less than a year away, political parties will again be focusing on the funding of their campaigns. As in previous elections, candidates will need two resources to sustain their general election campaigns – people and money. Each is in increasingly short supply. As a result, the nature of constituency campaigning has changed very substantially in recent decades, and is likely to change even more in the future.

People are needed to manage the constituency campaign and to promote the candidate’s/party’s cause across the local electorate: as the average constituency has some 70,000 voters, this means reaching a large number of people. In the past, most candidates could rely on activists drawn from their party’s local members, but as their numbers have declined the available pool has been reduced. Some candidates have replaced them by supporters – non-members who are nevertheless willing to promote the party’s cause – and by volunteers from nearby constituencies where there is an excess of supply relative to demand.

Money is needed to sustain the campaign organisation – its office and equipment, plus staffing – but in particular to meet the costs of posters and leaflets. Research has clearly shown that the more intensive the local campaign, as indicated by the amount that the candidate spends on those items, the better the performance: those who spend more tend to get more votes, and their opponents fewer.


Is Britain a Christian country and, whatever the case, what then?

Unusually, British politicians have been talking about religion this Easter.

(i) Events, dear Boy

First, the Communities and Local Government Secretary, Eric Pickles, whose Department leads on faith relations, and then the Prime Minister, David Cameron, both averred that Britain was still a Christian country – Mr Pickles, with customary brutality, reminding us that there is an established Church and advising people to ‘get over’ that fact. A large number of worthies then wrote jointly to the Daily Telegraph (editorially sympathetic to establishment) to challenge ministers’ views, labelling them as both false and divisive in a pluralised society of multiple belief and unbelief. This was countered in turn by a joint letter disagreeing with them.

This lukewarm pot was then stirred by the Deputy Prime Minister and leader of the Liberal Democrat party, Nick Clegg. Out of the blue in a radio programme, he floated the thought that the day was coming when church establishment should be stood down for everyone’s benefit, including that of the Church of England. The Prime Minister and others immediately rejected this view – long Liberal Democrat policy deriving from that party’s ancient Christian Nonconformist roots.

Understandably, the Archbishop of Canterbury, Justin Welby, head of the church established in England (and long ago disestablished in Ireland and Wales) felt moved also to comment – no tablets of stone, just a blog. Acknowledging that church attendance had greatly declined, he maintained that nonetheless much of the nation’s life had been ‘shaped and founded on Christianity’, and that ‘in the general sense of being founded on Christian faith, this is a Christian country’. Characterising objectors as atheists, he pointed to Muslim, Hindu and Sikh support for the Prime Minister’s remarks. This claim, which has been called ‘Anglican multifaithism’ [N. Bonney (2013) Monarchy, religion and the state], is a trope employed by Anglicans to assume a new role and purport to speak for the interests of all religions. On offer is an implied conduit into government valued apparently by a number of non-Christian faiths but not willingly by minority Christian denominations.


Jenny Watson’s lecture on the modernisation of the electoral administration system

In the latest Constitution Unit seminar, Jenny Watson, Chair of the Electoral Commission, gave the audience an eloquent account of the challenges and opportunities presented by current and forthcoming work on electoral modernisation. Drawing on the steps already taken by the previous Labour administration and, most recently, by the coalition government, she elaborated on the likely effects of the new legislation, including the transition to Individual Electoral Registration, and emphasised the pressing need for further modernisation of the electoral administration system.

The Electoral Commission has always played a vital role in that effort through a number of proposals and recommendations aimed at improving the election process. But, as Watson noted, it is comprehensive legislation that will create clarity and transparency and ensure that ‘confidence and the effectiveness of our system will be maintained’. A major step was taken in 2013 with the Electoral Registration and Administration Act, which replaced Household Electoral Registration (HER) with Individual Electoral Registration (IER) and introduced new close of poll arrangements. The move to IER is expected to improve the security of the registration process and to increase registration, mainly among younger voters, students and the mobile population. However, at a time of growing disengagement from the electoral process, there is an urgent need to reform the electoral framework, making it more efficient and less complex. As Jenny Watson highlighted, the Electoral Commission will be leading the way in finding the best ways to modernise the system and ‘make it more reflective of the wider society’.


A Code of Constitutional Standards

The Constitution Unit of University College London is today publishing a report which sets out a code of constitutional standards based on the reports of the House of Lords Select Committee on the Constitution. Since 2001 the Committee has made many recommendations in its reports, and the goal of this report was to codify these recommendations in order to make the Committee’s analysis of the constitution more accessible. The report, by Robert Hazell, Dawn Oliver and myself, contains a code of 126 constitutional standards, each of which is relevant to the legislative process, and each of which has been extracted from the 149 reports of the Constitution Committee that were reviewed. The standards are organised into five sections: the rule of law; delegated powers, delegated legislation and Henry VIII clauses; the separation of powers; individual rights; and parliamentary procedure.

The Constitution Committee’s formal terms of reference were set by the House of Lords Liaison Committee when it was established in 2001 and have not changed since then: ‘to examine the constitutional implications of all public Bills coming before the House; and to keep under review the operation of the constitution’. In its first report the Constitution Committee decided against drawing up a formalised code of constitutional norms to inform its bill scrutiny; instead, the Committee adopted a pragmatic approach, identifying the norms relevant to each particular bill or inquiry in question. This flexible approach has a number of advantages, but one disadvantage is that the Committee’s conception of the normative foundations of the constitution is not easily accessible.

The first aim of the code in this report is to make the normative foundations of the Committee’s work more accessible. As part of their work, the Committee has made choices about what the constitution means in the context of the legislative process. It is these choices that the code seeks to highlight. It is important to note that the Committee advanced many of the cited standards in relation to particular bills, and did not put them forward as generalised standards. There is little doubt that if the Committee were to advance its own code of constitutional standards, it would look different to the code within this report. Nevertheless, the code does represent an accurate summary of the constitutional norms that the Committee has sought to uphold in its work since it was established in 2001.

In terms of the content of the code, it is noteworthy that many of the standards appear to be derived from the principles that underpin the parliamentary process. For example, standards that seek to regulate the use of fast-track legislation are not just general principles of good governance, nor are they based on a particular constitutional principle; rather, they are derived from the normative foundations of the parliamentary process itself. Such standards serve to protect the integrity of the parliamentary process. This focus on parliamentary norms demonstrates the value of giving a parliamentary committee the task of assessing the constitutional implications of government bills. It has enabled the Committee to articulate the normative implications of the principles that form the foundations of the parliamentary process.

The second aim of the code is to provide a resource for those involved in the legislative process. It is widely recognised that one of the disadvantages of the United Kingdom’s uncodified constitution is that it is not easily accessible, and within Parliament the task of pointing out the constitutional implications of bills often falls to constitutional experts, particularly in the Lords, and to the relevant committees. If the norms of the constitution were more readily accessible, it would be reasonable to expect more parliamentarians to engage with them during the legislative process. By publishing this code, it is hoped that parliamentarians, and others involved in the law-making process, will make use of the standards within it during their scrutiny. The code might also be used by the Constitution Committee to develop its own code of legislative or constitutional standards.

The third aim is to contribute to the debate on the value of legislative standards within the legislative process in Westminster. In an earlier blog post, I put forward a critique of the code of legislative standards developed by the House of Commons Select Committee on Political and Constitutional Reform in their report titled ‘Ensuring standards in the quality of legislation’. In that post, I argued that although their code would represent a significant step forward, it did not go far enough. Since that post, the Government has issued its response to the PCRC’s report. The Government could not be clearer – it does not think that a code of legislative standards is a good idea (paras 12-15). It suggests that the Cabinet Guide to Making Legislation is all that is needed for parliamentarians to judge the standard of the Government’s approach. Further, the Government argued that the PCRC’s code would risk encouraging a ‘box-ticking mentality’, and pointed out that the code does not provide the ‘degree of objectivity it envisages’. The latter point is surprising because the PCRC’s code makes every effort to be as ‘neutral’ as possible.

The Government appears to have misinterpreted the rationale for a code of soft law standards. The idea is to stimulate parliamentary debate on aspects of bills to which the standards relate, rather than to introduce an objective box-ticking exercise. The presence of parliamentary sovereignty and the absence of a codified constitution are sometimes taken to mean that Government and Parliament legislate into a normative vacuum: that somehow parliamentary sovereignty means the government does not have to justify why a bill seeks to depart from the existing norms of the constitution. That idea, as Murray Hunt has recently argued in Parliament and the Law, is antithetical to any meaningful idea of constitutionalism. A code of constitutional standards is designed to challenge the myth of the normative vacuum and to raise the standard of justification within the legislative process, but without legally limiting Parliament’s legislative capacity. In this sense a code of soft law standards does not represent a threat to the political nature of the legislative process, as the code would always be the subject of debate and could be changed by purely political means. Soft law constitutional standards developed within Parliament might even find support from political constitutionalists, because they serve to enhance the quality of parliamentary debate by focusing the minds of parliamentarians on the value of the political process and the norms that form its basic architecture. Even if the standards are prescriptive, this does not mean that they cannot be departed from. The value of a code of soft law standards does not depend on their being complied with all of the time; instead it depends on their being used as the basis for debate and justification within the legislative process.

There seems to be little to lose and everything to gain from making more use of soft law codes of standards in Westminster. As this code demonstrates, committees within Parliament are already articulating the normative standards that are vital to the integrity of the parliamentary process. The challenge is to maximise the benefits of this work by making those standards as accessible and as influential as possible. It is hoped that this code makes a small contribution to this aim.

Directly Querying the Constitute Data

Thank you to all who attended my seminar today.  As promised, I am going to provide the code that I used to query the data underlying the Constitute site.

To start, you will need to know how to write a SPARQL query.  There are good resources online to teach you how to write such queries (see here or here).  Once you know a bit about writing SPARQL queries, you can test out your skills on the data underlying the Constitute site.  Just copy and paste your queries here.  To get you started, here are the two queries that I used in my seminar:

Query 1:

PREFIX ontology:<http://tata.csres.utexas.edu:8080/constitute/ontology/>
PREFIX rdfs:<http://www.w3.org/2000/01/rdf-schema#>
SELECT ?const ?country ?year
WHERE {
?const ontology:isConstitutionOf ?country .
?const ontology:yearEnacted ?year .
}

Query 2:

PREFIX ontology:<http://tata.csres.utexas.edu:8080/constitute/ontology/>
PREFIX rdfs:<http://www.w3.org/2000/01/rdf-schema#>
SELECT ?const ?country ?year ?region ?sectionType ?sectionText ?childType ?childText
WHERE {
?const ontology:isConstitutionOf ?country .
?const ontology:yearEnacted ?year .
?section ontology:isSectionOf ?const .
?country ontology:isInGroup ?region .
?section ontology:hasTopic ontology:env .
?section ontology:rowType ?sectionType .
OPTIONAL {?section ontology:text ?sectionText}
OPTIONAL {?childSection ontology:parent ?section . ?childSection ontology:text ?childText}
OPTIONAL {?childSection ontology:parent ?section . ?childSection ontology:rowType ?childType}
}

Notice the “topic” line in the second query (?section ontology:hasTopic ontology:env .).  The env part of that line is the tag that we use to indicate provisions that deal with “Protection of environment”.  You can explore the list of topics included on Constitute and their associated tags here.
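If you would rather discover the available topic tags directly from the data than browse the list on the site, a minimal query along the following lines should do it. This is a sketch of my own rather than one of the seminar queries, but it uses only the ontology:hasTopic predicate already shown above:

PREFIX ontology:<http://tata.csres.utexas.edu:8080/constitute/ontology/>
SELECT DISTINCT ?topic
WHERE {
?section ontology:hasTopic ?topic .
}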

The next step is to apply your querying knowledge using the SPARQL package in R.  I will demonstrate how this is done by walking you through the creation of the Word Cloud that I discussed during my seminar (the code for the histogram is easier to understand and is below).  First, query the SPARQL endpoint using R:

#Opens the Relevant Libraries
library(SPARQL)
#Defines URL for Endpoint
endpoint <- "http://tata.csres.utexas.edu:8080/openrdf-sesame/repositories/test"
#Defines the Query
query <- "PREFIX ontology:<http://tata.csres.utexas.edu:8080/constitute/ontology/>
PREFIX rdfs:<http://www.w3.org/2000/01/rdf-schema#>
SELECT ?const ?country ?year ?region ?sectionType ?sectionText ?childType ?childText
WHERE {
?const ontology:isConstitutionOf ?country .
?const ontology:yearEnacted ?year .
?section ontology:isSectionOf ?const .
?country ontology:isInGroup ?region .
?section ontology:hasTopic ontology:env .
?section ontology:rowType ?sectionType .
OPTIONAL {?section ontology:text ?sectionText}
OPTIONAL {?childSection ontology:parent ?section . ?childSection ontology:text ?childText}
OPTIONAL {?childSection ontology:parent ?section . ?childSection ontology:rowType ?childType}
}"

#Queries the endpoint
sparql <- SPARQL(endpoint,query,ns=c('ontology','<http://tata.csres.utexas.edu:8080/constitute/ontology/>','const','<http://tata.csres.utexas.edu:8080/constitute/>'))
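Before going further, it can be worth a quick sanity check on what the endpoint returned. This is just standard base R, not part of the seminar code:

#Shows the first few rows of the returned results table
head(sparql$results)
#Shows how many rows and columns were returned
dim(sparql$results)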

You now have a data table with the relevant textual data available to you in R under sparql$results.  The second step is to organize that data into a corpus using the text mining package (tm, for short).  Ultimately, I am only interested in rows in the data table that have text (i.e. the sectionText and childText columns are not empty) and that are from constitutions written in the Americas or Africa, so I will filter the data along these lines in this step of the process.  Here is the code:

#Opens the Relevant Libraries
library(tm)
library(SnowballC)

#Filters Out Correct Regions
data.Africa <- subset(sparql$results,sparql$results$region=="const:ontology/Africa")
data.Americas <- subset(sparql$results,sparql$results$region=="const:ontology/Americas")

#Extracts Section Text from Results and Removes Missing Values
sText.Africa <- subset(data.Africa,data.Africa$sectionText!='NA')
sText.Africa <- subset(sText.Africa$sectionText,sText.Africa$sectionType=="const:ontology/body")
sText.Americas <- subset(data.Americas,data.Americas$sectionText!='NA')
sText.Americas <- subset(sText.Americas$sectionText,sText.Americas$sectionType=="const:ontology/body")

#Extracts Child Section Text from Results and Removes Missing Values
cText.Africa <- subset(data.Africa,data.Africa$childText!='NA')
cText.Africa <- subset(cText.Africa$childText,cText.Africa$childType=="const:ontology/body")
cText.Americas <- subset(data.Americas,data.Americas$childText!='NA')
cText.Americas <- subset(cText.Americas$childText,cText.Americas$childType=="const:ontology/body")

#Appends Parent and Child Text Together
Text.Africa <- data.frame(c(sText.Africa,cText.Africa))
Text.Americas <- data.frame(c(sText.Americas,cText.Americas))

#Converts Data Frames to Corpora
corpus.Africa <- Corpus(VectorSource(Text.Africa))
corpus.Americas <- Corpus(VectorSource(Text.Americas))

Now that I have organized the relevant text into corpora, I need to clean those corpora by removing stop words (e.g. a, an and the), punctuation and numbers, and by stemming words.  This is standard practice before analyzing text: it prevents “the” from being the largest word in my word cloud and makes sure that “right” and “rights” are not counted separately.  The tm package has all the tools to perform this cleaning.  Here is the code:

#Makes All Characters Lower-Case
corpus.Africa <- tm_map(corpus.Africa,tolower)
corpus.Americas <- tm_map(corpus.Americas,tolower)

#Removes Punctuation
corpus.Africa <- tm_map(corpus.Africa,removePunctuation)
corpus.Americas <- tm_map(corpus.Americas,removePunctuation)

#Removes Numbers
corpus.Africa <- tm_map(corpus.Africa,removeNumbers)
corpus.Americas <- tm_map(corpus.Americas,removeNumbers)

#Removes Stopwords
corpus.Africa <- tm_map(corpus.Africa,removeWords,stopwords('english'))
corpus.Americas <- tm_map(corpus.Americas,removeWords,stopwords('english'))

#Stems Words
dict.corpus.Africa <- corpus.Africa
corpus.Africa <- tm_map(corpus.Africa,stemDocument)
corpus.Africa <- tm_map(corpus.Africa,stemCompletion,dictionary=dict.corpus.Africa)
dict.corpus.Americas <- corpus.Americas
corpus.Americas <- tm_map(corpus.Americas,stemDocument)
corpus.Americas <- tm_map(corpus.Americas,stemCompletion,dictionary=dict.corpus.Americas)

The last step is to analyze the cleaned text.  I created a simple word cloud, but you could apply even more sophisticated text analysis techniques to the textual data on Constitute.  I used the wordcloud package to accomplish this task.  Here is the code I used to create the word clouds for my presentation:

#Opens the Relevant Libraries
library(wordcloud)
library(RColorBrewer)
library(lattice)

#Creates a PNG Document for Saving
png(file="WC_env.png", height = 7.5, width = 14, units = "in", res=600, antialias = "cleartype")

#Sets Layout
layout(matrix(c(1:2), byrow = TRUE, ncol = 2), widths = c(1,1), heights = c(1,1), respect = TRUE)

#Sets Overall Options
par(oma = c(0,0,5,0))

#Selects Colors
pal <- brewer.pal(8,"Greys")
pal <- pal[-(1:3)]

#Word Cloud for the Americas
wordcloud(corpus.Americas,scale=c(3,0.4),max.words=Inf,random.order=FALSE,rot.per=0.20,colors=pal)

#Creates Title for Americas Word Cloud
mtext("Americas",side=3,cex=1.25,line=4)

#Word Cloud for Africa
wordcloud(corpus.Africa,scale=c(3,0.4),max.words=Inf,random.order=FALSE,rot.per=0.20,colors=pal)

#Creates Title for African Word Cloud
mtext("Africa",side=3,cex=1.25,line=4)

#Creates an Overall Title for the Figure
mtext("Constitutional Provisions on the Environment",outer=TRUE,cex=2,font=2,line=1.5)

#Closes the Plot
dev.off()

Note that the plotting commands above are complicated by the fact that I wanted to combine two word clouds into the same image.  Had I only wanted to create a single word cloud, say for Africa, and had not cared much about the colors of the plot, the following commands would have sufficed:

#Opens the Relevant Libraries
library(wordcloud)

#Word Cloud for Africa
wordcloud(corpus.Africa,scale=c(3,0.4),max.words=Inf,random.order=FALSE,rot.per=0.20)

Anyway, here is the resulting word cloud:

[Figure: word clouds of constitutional provisions on the environment, Americas and Africa]

With the commands above, you should be able to replicate the word clouds from my seminar.  In addition, minor modifications to the commands above will allow you to describe the constitutional provisions on different topics or to compare the way that different regions address certain topics.  One could even perform more advanced analyses of these texts with the SPARQL queries outlined above.
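As a sketch of the kind of modification I have in mind (the topic variable below is illustrative; substitute any tag from the Constitute topic list), the query string can be built in R so that a single variable controls which topic is retrieved:

#Opens the Relevant Library
library(SPARQL)

#Topic tag to retrieve ("env" is the environment tag used above; swap in any other tag from the topic list)
topic <- "env"

#Builds the query string with the chosen topic spliced in
query <- sprintf("PREFIX ontology:<http://tata.csres.utexas.edu:8080/constitute/ontology/>
SELECT ?const ?country ?sectionText
WHERE {
?const ontology:isConstitutionOf ?country .
?section ontology:isSectionOf ?const .
?section ontology:hasTopic ontology:%s .
OPTIONAL {?section ontology:text ?sectionText}
}", topic)

#Queries the endpoint as before
endpoint <- "http://tata.csres.utexas.edu:8080/openrdf-sesame/repositories/test"
sparql <- SPARQL(endpoint,query,ns=c('ontology','<http://tata.csres.utexas.edu:8080/constitute/ontology/>','const','<http://tata.csres.utexas.edu:8080/constitute/>'))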

CODE FOR HISTOGRAM

#Opens the Relevant Libraries
library(SPARQL)
library(RColorBrewer)

#Defines URL for Endpoint
endpoint <- "http://tata.csres.utexas.edu:8080/openrdf-sesame/repositories/test"

#Defines the Query
query <- "PREFIX ontology:<http://tata.csres.utexas.edu:8080/constitute/ontology/>
PREFIX rdfs:<http://www.w3.org/2000/01/rdf-schema#>
SELECT ?const ?country ?year
WHERE {
?const ontology:isConstitutionOf ?country .
?const ontology:yearEnacted ?year .
}"

#Makes the Query
sparql <- SPARQL(endpoint,query,ns=c('ontology','<http://tata.csres.utexas.edu:8080/constitute/ontology/>','const','<http://tata.csres.utexas.edu:8080/constitute/>'))

#Subsets Data
data.year <- data.frame(subset(sparql$results,select=c("const","year")))

#Drops Duplicate Observations
data.year <- unique(data.year)

#Makes Year Numeric
year <- as.numeric(data.year$year)

#Creates PNG Document for Saving
png(file="Histogram_Year.png")

#Selects Colors
pal <- brewer.pal(3,"Greys")
pal <- pal[-(1:2)]

#Histogram Command
hist(year, breaks=21, col = pal, border = pal, xlab = "Promulgation Year", ylab = "Number of Constitutions", ylim = c(0,60), xlim = c(1790,2010), main = "Constitutions on Constitute")

#Closes the Plot
dev.off()

And here it is:

[Figure: histogram of constitutions on Constitute by promulgation year]
